Oct 06 08:19:07 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 06 08:19:07 crc restorecon[4739]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 06 08:19:07 crc restorecon[4739]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc 
restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc 
restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 
08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 
crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 
08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:19:07 crc 
restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc 
restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc 
restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 
crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc 
restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:07 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc 
restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc 
restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc 
restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:19:08 crc restorecon[4739]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 08:19:08 crc restorecon[4739]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 06 08:19:08 crc kubenswrapper[4991]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 08:19:08 crc kubenswrapper[4991]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 06 08:19:08 crc kubenswrapper[4991]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 08:19:08 crc kubenswrapper[4991]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 06 08:19:08 crc kubenswrapper[4991]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 06 08:19:08 crc kubenswrapper[4991]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.982535 4991 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987750 4991 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987780 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987787 4991 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987794 4991 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987804 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987812 4991 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987828 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987838 4991 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987845 4991 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 
08:19:08.987852 4991 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987859 4991 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987866 4991 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987872 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987879 4991 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987884 4991 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987891 4991 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987898 4991 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987904 4991 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987910 4991 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987917 4991 feature_gate.go:330] unrecognized feature gate: Example Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987923 4991 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987933 4991 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987941 4991 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987947 4991 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987954 4991 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987961 4991 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987967 4991 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987973 4991 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987980 4991 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987986 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987993 4991 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.987999 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988005 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988011 4991 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988017 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988027 4991 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988037 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988045 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988053 4991 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988062 4991 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988069 4991 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988076 4991 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988085 4991 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988093 4991 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988100 4991 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988108 4991 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988115 4991 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988122 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988128 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988136 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988144 4991 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988151 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988158 4991 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988165 4991 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988172 4991 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988179 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988189 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988195 4991 feature_gate.go:330] unrecognized feature 
gate: AutomatedEtcdBackup Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988203 4991 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988211 4991 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988219 4991 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988226 4991 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988233 4991 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988239 4991 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988245 4991 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988251 4991 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988257 4991 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988265 4991 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988272 4991 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988279 4991 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.988286 4991 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988452 4991 flags.go:64] FLAG: --address="0.0.0.0" Oct 06 08:19:08 crc 
kubenswrapper[4991]: I1006 08:19:08.988469 4991 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988488 4991 flags.go:64] FLAG: --anonymous-auth="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988499 4991 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988513 4991 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988520 4991 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988532 4991 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988541 4991 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988549 4991 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988557 4991 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988566 4991 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988573 4991 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988581 4991 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988589 4991 flags.go:64] FLAG: --cgroup-root="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988597 4991 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988606 4991 flags.go:64] FLAG: --client-ca-file="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988613 4991 flags.go:64] FLAG: --cloud-config="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988620 
4991 flags.go:64] FLAG: --cloud-provider="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988627 4991 flags.go:64] FLAG: --cluster-dns="[]" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988636 4991 flags.go:64] FLAG: --cluster-domain="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988644 4991 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988652 4991 flags.go:64] FLAG: --config-dir="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988659 4991 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988667 4991 flags.go:64] FLAG: --container-log-max-files="5" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988677 4991 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988685 4991 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988692 4991 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988700 4991 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988708 4991 flags.go:64] FLAG: --contention-profiling="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988716 4991 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988724 4991 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988732 4991 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988742 4991 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988752 4991 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 06 08:19:08 crc 
kubenswrapper[4991]: I1006 08:19:08.988759 4991 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988767 4991 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988774 4991 flags.go:64] FLAG: --enable-load-reader="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988782 4991 flags.go:64] FLAG: --enable-server="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988789 4991 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988800 4991 flags.go:64] FLAG: --event-burst="100" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988807 4991 flags.go:64] FLAG: --event-qps="50" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988815 4991 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988823 4991 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988830 4991 flags.go:64] FLAG: --eviction-hard="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988841 4991 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988848 4991 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988855 4991 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988863 4991 flags.go:64] FLAG: --eviction-soft="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988871 4991 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988878 4991 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988885 4991 flags.go:64] FLAG: 
--experimental-allocatable-ignore-eviction="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988892 4991 flags.go:64] FLAG: --experimental-mounter-path="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988900 4991 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988907 4991 flags.go:64] FLAG: --fail-swap-on="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988914 4991 flags.go:64] FLAG: --feature-gates="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988923 4991 flags.go:64] FLAG: --file-check-frequency="20s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988931 4991 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988939 4991 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988947 4991 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988954 4991 flags.go:64] FLAG: --healthz-port="10248" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988962 4991 flags.go:64] FLAG: --help="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988969 4991 flags.go:64] FLAG: --hostname-override="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988977 4991 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988985 4991 flags.go:64] FLAG: --http-check-frequency="20s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.988993 4991 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989001 4991 flags.go:64] FLAG: --image-credential-provider-config="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989008 4991 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989015 4991 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989024 4991 flags.go:64] FLAG: --image-service-endpoint="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989031 4991 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989039 4991 flags.go:64] FLAG: --kube-api-burst="100" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989046 4991 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989054 4991 flags.go:64] FLAG: --kube-api-qps="50" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989061 4991 flags.go:64] FLAG: --kube-reserved="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989069 4991 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989076 4991 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989084 4991 flags.go:64] FLAG: --kubelet-cgroups="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989090 4991 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989098 4991 flags.go:64] FLAG: --lock-file="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989105 4991 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989113 4991 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989121 4991 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989133 4991 flags.go:64] FLAG: --log-json-split-stream="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989140 4991 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989149 4991 flags.go:64] FLAG: 
--log-text-split-stream="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989156 4991 flags.go:64] FLAG: --logging-format="text" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989164 4991 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989173 4991 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989180 4991 flags.go:64] FLAG: --manifest-url="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989187 4991 flags.go:64] FLAG: --manifest-url-header="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989198 4991 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989205 4991 flags.go:64] FLAG: --max-open-files="1000000" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989215 4991 flags.go:64] FLAG: --max-pods="110" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989223 4991 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989232 4991 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989240 4991 flags.go:64] FLAG: --memory-manager-policy="None" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989256 4991 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989263 4991 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989270 4991 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989278 4991 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989318 4991 flags.go:64] FLAG: 
--node-status-max-images="50" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989326 4991 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989333 4991 flags.go:64] FLAG: --oom-score-adj="-999" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989341 4991 flags.go:64] FLAG: --pod-cidr="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989348 4991 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989360 4991 flags.go:64] FLAG: --pod-manifest-path="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989366 4991 flags.go:64] FLAG: --pod-max-pids="-1" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989372 4991 flags.go:64] FLAG: --pods-per-core="0" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989378 4991 flags.go:64] FLAG: --port="10250" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989385 4991 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989392 4991 flags.go:64] FLAG: --provider-id="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989400 4991 flags.go:64] FLAG: --qos-reserved="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989407 4991 flags.go:64] FLAG: --read-only-port="10255" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989414 4991 flags.go:64] FLAG: --register-node="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989422 4991 flags.go:64] FLAG: --register-schedulable="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989429 4991 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989442 4991 flags.go:64] FLAG: --registry-burst="10" Oct 06 08:19:08 crc 
kubenswrapper[4991]: I1006 08:19:08.989448 4991 flags.go:64] FLAG: --registry-qps="5" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989454 4991 flags.go:64] FLAG: --reserved-cpus="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989468 4991 flags.go:64] FLAG: --reserved-memory="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989477 4991 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989483 4991 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989489 4991 flags.go:64] FLAG: --rotate-certificates="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989495 4991 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989501 4991 flags.go:64] FLAG: --runonce="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989507 4991 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989513 4991 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989519 4991 flags.go:64] FLAG: --seccomp-default="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989525 4991 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989533 4991 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989539 4991 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989545 4991 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989552 4991 flags.go:64] FLAG: --storage-driver-password="root" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989558 4991 flags.go:64] FLAG: --storage-driver-secure="false" Oct 06 08:19:08 crc 
kubenswrapper[4991]: I1006 08:19:08.989563 4991 flags.go:64] FLAG: --storage-driver-table="stats" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989570 4991 flags.go:64] FLAG: --storage-driver-user="root" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989577 4991 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989583 4991 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989589 4991 flags.go:64] FLAG: --system-cgroups="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989595 4991 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989604 4991 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989610 4991 flags.go:64] FLAG: --tls-cert-file="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989616 4991 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989623 4991 flags.go:64] FLAG: --tls-min-version="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989629 4991 flags.go:64] FLAG: --tls-private-key-file="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989634 4991 flags.go:64] FLAG: --topology-manager-policy="none" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989640 4991 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989647 4991 flags.go:64] FLAG: --topology-manager-scope="container" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989655 4991 flags.go:64] FLAG: --v="2" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989665 4991 flags.go:64] FLAG: --version="false" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989675 4991 flags.go:64] FLAG: --vmodule="" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 
08:19:08.989688 4991 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 06 08:19:08 crc kubenswrapper[4991]: I1006 08:19:08.989696 4991 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.989872 4991 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.989882 4991 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.989890 4991 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.989896 4991 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.989903 4991 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.989908 4991 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.989914 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.989920 4991 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 08:19:08 crc kubenswrapper[4991]: W1006 08:19:08.989930 4991 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989936 4991 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989942 4991 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989947 4991 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989953 4991 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989958 4991 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989963 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989969 4991 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989975 4991 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989980 4991 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989985 4991 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989989 4991 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989995 4991 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.989999 4991 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990004 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990010 4991 
feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990015 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990020 4991 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990025 4991 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990030 4991 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990036 4991 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990042 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990051 4991 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990057 4991 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990064 4991 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990071 4991 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990078 4991 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990085 4991 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990091 4991 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990096 4991 feature_gate.go:330] unrecognized feature gate: 
BareMetalLoadBalancer Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990101 4991 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990106 4991 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990114 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990119 4991 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990124 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990130 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990134 4991 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990139 4991 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990145 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990150 4991 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990157 4991 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990163 4991 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990169 4991 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990174 4991 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990180 4991 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990184 4991 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990189 4991 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990194 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990199 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990204 4991 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990210 4991 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990219 4991 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990227 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990234 4991 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990244 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990252 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990259 4991 feature_gate.go:330] unrecognized feature gate: Example Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990266 4991 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990272 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990277 4991 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990283 4991 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990287 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:08.990314 4991 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:08.990337 4991 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.003661 4991 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.003696 4991 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003821 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003835 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003844 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003853 4991 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003861 4991 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003869 4991 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003876 4991 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003884 4991 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003892 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003900 4991 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003908 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003916 4991 feature_gate.go:330] unrecognized feature gate: 
EtcdBackendQuota Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003924 4991 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003932 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003941 4991 feature_gate.go:330] unrecognized feature gate: Example Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003949 4991 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003956 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003964 4991 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003975 4991 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003988 4991 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.003999 4991 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004010 4991 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004021 4991 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004030 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004043 4991 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004054 4991 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004064 4991 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004074 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004083 4991 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004092 4991 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004100 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004108 4991 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004116 4991 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004123 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004134 4991 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004142 4991 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004150 4991 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004158 4991 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004166 4991 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004174 
4991 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004182 4991 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004190 4991 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004197 4991 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004205 4991 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004212 4991 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004220 4991 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004228 4991 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004236 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004247 4991 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004256 4991 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004265 4991 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004274 4991 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004283 4991 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004291 4991 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004324 4991 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004333 4991 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004341 4991 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004349 4991 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004357 4991 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004365 4991 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004374 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004381 4991 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004389 4991 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004397 4991 feature_gate.go:330] 
unrecognized feature gate: ImageStreamImportMode Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004404 4991 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004412 4991 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004420 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004427 4991 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004435 4991 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004443 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004452 4991 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.004465 4991 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004679 4991 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004694 4991 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004703 4991 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 06 08:19:09 crc 
kubenswrapper[4991]: W1006 08:19:09.004713 4991 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004723 4991 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004735 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004745 4991 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004753 4991 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004763 4991 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004771 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004780 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004789 4991 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004798 4991 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004806 4991 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004814 4991 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004822 4991 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004830 4991 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 06 08:19:09 crc 
kubenswrapper[4991]: W1006 08:19:09.004839 4991 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004847 4991 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004855 4991 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004863 4991 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004871 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004881 4991 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004891 4991 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004902 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004911 4991 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004919 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004927 4991 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004935 4991 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004943 4991 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004951 4991 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004959 4991 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004967 4991 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004974 4991 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004983 4991 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.004992 4991 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005000 4991 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005007 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 
08:19:09.005016 4991 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005023 4991 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005032 4991 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005039 4991 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005047 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005054 4991 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005062 4991 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005070 4991 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005078 4991 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005086 4991 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005093 4991 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005101 4991 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005112 4991 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005122 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005130 4991 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005139 4991 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005148 4991 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005156 4991 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005165 4991 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005173 4991 feature_gate.go:330] unrecognized feature gate: Example Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005181 4991 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005188 4991 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005197 4991 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005205 4991 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005214 4991 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005222 4991 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005229 4991 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 
08:19:09.005237 4991 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005245 4991 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005253 4991 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005261 4991 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005269 4991 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.005278 4991 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.005290 4991 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.005541 4991 server.go:940] "Client rotation is on, will bootstrap in background" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.011751 4991 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.011927 4991 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.013505 4991 server.go:997] "Starting client certificate rotation" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.013553 4991 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.015689 4991 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-11 15:47:31.495579008 +0000 UTC Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.015837 4991 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2335h28m22.479747483s for next certificate rotation Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.047050 4991 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.054010 4991 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.072862 4991 log.go:25] "Validated CRI v1 runtime API" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.103361 4991 log.go:25] "Validated CRI v1 image API" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.106250 4991 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.113617 4991 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-06-08-14-33-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.113688 4991 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.132810 4991 manager.go:217] Machine: {Timestamp:2025-10-06 08:19:09.13072878 +0000 UTC m=+0.868478811 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a9848c46-d1c6-4335-aa9d-2c0df75a6fc7 BootID:fdc65aba-65bf-4101-b45c-7ba497b89a18 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:c8:c4:96 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c8:c4:96 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e1:18:ae Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d9:73:5b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2e:7d:b3 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:2f:4c:7a Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:b9:64:15 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e2:d7:56:4d:dc:9f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:b6:39:84:3c:ba:71 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.133065 4991 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.133242 4991 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.133641 4991 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.133822 4991 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.133881 4991 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.135339 4991 topology_manager.go:138] "Creating topology manager with none policy" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.135360 4991 container_manager_linux.go:303] "Creating device plugin manager" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.136191 4991 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.136218 4991 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.136447 4991 state_mem.go:36] "Initialized new in-memory state store" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.136579 4991 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.141011 4991 kubelet.go:418] "Attempting to sync node with API server" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.141034 4991 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.141049 4991 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.141064 4991 kubelet.go:324] "Adding apiserver pod source" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.141078 4991 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 
08:19:09.150274 4991 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.151257 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.151265 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.151451 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.151513 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.151902 4991 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.157528 4991 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159712 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159759 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159775 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159790 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159813 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159827 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159842 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159866 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159883 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159899 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159919 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.159934 4991 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.161915 4991 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.162856 4991 server.go:1280] "Started kubelet" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.164482 4991 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:09 crc systemd[1]: Started Kubernetes Kubelet. Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.166792 4991 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.166958 4991 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.168237 4991 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.170183 4991 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.170224 4991 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.170258 4991 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 08:18:49.099308378 +0000 UTC Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.170381 4991 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2279h59m39.928952597s for next certificate rotation Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.170666 4991 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.170746 4991 volume_manager.go:287] "The desired_state_of_world populator 
starts" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.170809 4991 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.170857 4991 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.171653 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.61:6443: connect: connection refused" interval="200ms" Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.171912 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.172222 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.172763 4991 server.go:460] "Adding debug handlers to kubelet server" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.174669 4991 factory.go:153] Registering CRI-O factory Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.174734 4991 factory.go:221] Registration of the crio container factory successfully Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.174810 4991 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or 
directory Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.174823 4991 factory.go:55] Registering systemd factory Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.174832 4991 factory.go:221] Registration of the systemd container factory successfully Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.174858 4991 factory.go:103] Registering Raw factory Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.174879 4991 manager.go:1196] Started watching for new ooms in manager Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.175548 4991 manager.go:319] Starting recovery of all containers Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.175466 4991 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.61:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186bd911a156e55f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-06 08:19:09.162808671 +0000 UTC m=+0.900558722,LastTimestamp:2025-10-06 08:19:09.162808671 +0000 UTC m=+0.900558722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.194833 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.194959 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195013 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195046 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195072 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195103 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195125 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195153 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195186 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195217 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195242 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195491 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195518 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195558 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195583 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195603 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195623 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195643 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195664 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195689 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195746 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195768 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195790 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195811 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195831 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195854 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195879 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195913 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195944 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.195972 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196005 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196024 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196046 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196118 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196148 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196174 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196203 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196230 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196259 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196281 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196329 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196591 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.196618 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.201648 4991 manager.go:324] Recovery completed Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.202627 4991 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.202747 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.202823 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.202882 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.202939 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203010 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203077 4991 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203140 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203202 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203258 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203355 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203424 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203496 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203557 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203616 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203681 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203745 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203845 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203910 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.203996 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204063 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204139 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204200 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204261 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204338 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204404 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204475 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204535 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204591 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204648 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204712 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" 
seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204777 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204837 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204895 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.204956 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205023 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205085 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205144 4991 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205202 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205259 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205362 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205448 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205514 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205575 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205636 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205696 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205761 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205829 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205891 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.205950 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206012 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206070 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206142 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206216 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206314 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206375 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206451 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206549 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206636 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206718 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206806 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206887 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.206977 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207066 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207137 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207197 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207258 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207335 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207406 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207471 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207529 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207591 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207678 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207757 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207824 4991 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207893 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.207997 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208060 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208122 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208188 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208246 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208321 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208385 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208443 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208517 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208580 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208642 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208704 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208769 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208826 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208889 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.208948 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209006 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" 
seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209064 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209127 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209196 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209258 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209332 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209392 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 06 
08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209459 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209527 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209587 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209652 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209710 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209768 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209829 4991 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209888 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.209973 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210076 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210142 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210202 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210265 4991 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210350 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210420 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210499 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210562 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210626 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210720 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210785 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210846 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210903 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.210966 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.211025 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.211081 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.211148 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.211204 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.211839 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.211913 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.211981 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212039 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212094 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212151 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212213 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212275 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212363 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212455 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" 
seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212531 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212609 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212717 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212796 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.212949 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213034 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213119 4991 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213205 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213286 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213388 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213458 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213539 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213614 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213677 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213735 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213794 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.213850 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214128 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214194 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214269 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214344 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214405 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214478 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214545 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214601 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" 
seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214656 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214712 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214769 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214831 4991 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214893 4991 reconstruct.go:97] "Volume reconstruction finished" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.214946 4991 reconciler.go:26] "Reconciler: start to sync state" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.220556 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.222804 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.222921 4991 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.223022 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.227398 4991 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.227547 4991 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.227635 4991 state_mem.go:36] "Initialized new in-memory state store" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.238904 4991 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.241462 4991 policy_none.go:49] "None policy: Start" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.242292 4991 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.242422 4991 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.242507 4991 kubelet.go:2335] "Starting kubelet main sync loop" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.242656 4991 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.242741 4991 state_mem.go:35] "Initializing new in-memory state store" Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.242727 4991 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.244603 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": 
dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.244749 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.271131 4991 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.305552 4991 manager.go:334] "Starting Device Plugin manager" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.305638 4991 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.305661 4991 server.go:79] "Starting device plugin registration server" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.306646 4991 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.306678 4991 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.306966 4991 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.307076 4991 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.307097 4991 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.314700 4991 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not 
found" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.343521 4991 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.343651 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.345493 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.345565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.345579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.345815 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.346046 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.346093 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.347404 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.347441 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.347451 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.347536 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.347642 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.347689 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.347756 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.348017 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.348109 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.349322 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.349438 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.349455 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.349554 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.349621 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.349654 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.351817 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.352179 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.352255 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.353487 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.353621 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.353691 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.354745 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.354935 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.354981 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.355044 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.354992 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.355135 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.356550 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.356648 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.356731 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.356578 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.356867 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.356898 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.357353 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.357414 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.358708 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.358743 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.358759 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.372451 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.61:6443: connect: connection refused" interval="400ms" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.406910 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.408453 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.408533 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.408556 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.408597 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.409338 4991 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.61:6443: connect: connection refused" node="crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.417810 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.417871 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.417909 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.417941 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.417981 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.418021 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.418053 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.418084 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.418115 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.418149 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.418182 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.418231 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.418264 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.418321 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.418449 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519390 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519477 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519511 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519546 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519581 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519613 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519646 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519689 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519708 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519750 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519771 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519808 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519721 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519641 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519770 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519731 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519691 4991 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519995 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519968 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.519729 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.520042 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.520070 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.520137 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.520210 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.520245 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.520254 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.520277 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc 
kubenswrapper[4991]: I1006 08:19:09.520318 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.520285 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.520365 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.609925 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.613601 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.613684 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.613712 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.613765 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.614596 4991 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.61:6443: connect: connection refused" node="crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.682692 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.690208 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.711966 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.740188 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: I1006 08:19:09.746273 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:19:09 crc kubenswrapper[4991]: E1006 08:19:09.773936 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.61:6443: connect: connection refused" interval="800ms" Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.806981 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d6c3d6034fc7997a6aa91616293992b0892ffdebf957282382be4b29cd586758 WatchSource:0}: Error finding container d6c3d6034fc7997a6aa91616293992b0892ffdebf957282382be4b29cd586758: Status 404 returned error can't find the container with id d6c3d6034fc7997a6aa91616293992b0892ffdebf957282382be4b29cd586758 Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.810386 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-cfccc3d64316f6738a4f7061d50a88b3e2b62772c8678e9db2640756a797ff0e WatchSource:0}: Error finding container cfccc3d64316f6738a4f7061d50a88b3e2b62772c8678e9db2640756a797ff0e: Status 404 returned error can't find the container with id cfccc3d64316f6738a4f7061d50a88b3e2b62772c8678e9db2640756a797ff0e Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.813460 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-f8b8954d22a59319089a6b418ff469ef20183ec895e09b9911f36071f5565bff WatchSource:0}: Error finding container f8b8954d22a59319089a6b418ff469ef20183ec895e09b9911f36071f5565bff: Status 404 returned error can't find the container with id 
f8b8954d22a59319089a6b418ff469ef20183ec895e09b9911f36071f5565bff Oct 06 08:19:09 crc kubenswrapper[4991]: W1006 08:19:09.821238 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-afb6faf912e7e1ad8b788462e012ee4e016fc822aa31f1bb5e067f5d51a3ee9e WatchSource:0}: Error finding container afb6faf912e7e1ad8b788462e012ee4e016fc822aa31f1bb5e067f5d51a3ee9e: Status 404 returned error can't find the container with id afb6faf912e7e1ad8b788462e012ee4e016fc822aa31f1bb5e067f5d51a3ee9e Oct 06 08:19:10 crc kubenswrapper[4991]: W1006 08:19:10.006765 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:10 crc kubenswrapper[4991]: E1006 08:19:10.006954 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.015496 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.017735 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.018255 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.018318 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.018360 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:19:10 crc kubenswrapper[4991]: E1006 08:19:10.018856 4991 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.61:6443: connect: connection refused" node="crc" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.166032 4991 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:10 crc kubenswrapper[4991]: W1006 08:19:10.252448 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:10 crc kubenswrapper[4991]: E1006 08:19:10.252542 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.253449 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f8b8954d22a59319089a6b418ff469ef20183ec895e09b9911f36071f5565bff"} Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.255041 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cfccc3d64316f6738a4f7061d50a88b3e2b62772c8678e9db2640756a797ff0e"} Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.256348 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d6c3d6034fc7997a6aa91616293992b0892ffdebf957282382be4b29cd586758"} Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.257555 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"afb6faf912e7e1ad8b788462e012ee4e016fc822aa31f1bb5e067f5d51a3ee9e"} Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.258631 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"02c211fc33563c0888946bfe738dbf23258f6c0166aee4193b53355e266bebb3"} Oct 06 08:19:10 crc kubenswrapper[4991]: W1006 08:19:10.351260 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:10 crc kubenswrapper[4991]: E1006 08:19:10.351425 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:10 crc kubenswrapper[4991]: W1006 08:19:10.481672 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:10 crc kubenswrapper[4991]: E1006 08:19:10.481822 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:10 crc kubenswrapper[4991]: E1006 08:19:10.576792 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.61:6443: connect: connection refused" interval="1.6s" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.819271 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.821269 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.821363 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.821412 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:10 crc kubenswrapper[4991]: I1006 08:19:10.821451 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:19:10 crc kubenswrapper[4991]: E1006 08:19:10.822640 4991 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.61:6443: connect: connection refused" node="crc" Oct 06 08:19:11 crc 
kubenswrapper[4991]: I1006 08:19:11.166407 4991 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.265122 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa"} Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.265194 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d"} Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.268570 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6"} Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.268580 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.268499 4991 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6" exitCode=0 Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.270081 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.270123 4991 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.270134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.273290 4991 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde" exitCode=0 Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.273453 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.273464 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde"} Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.275404 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.275450 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.275462 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.276647 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e" exitCode=0 Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.276706 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.276766 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e"} Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.277650 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.277689 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.277706 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.279598 4991 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00" exitCode=0 Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.279677 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00"} Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.279731 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.280047 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.281437 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.281470 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:11 crc 
kubenswrapper[4991]: I1006 08:19:11.281440 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.281483 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.281501 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:11 crc kubenswrapper[4991]: I1006 08:19:11.281517 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:11 crc kubenswrapper[4991]: W1006 08:19:11.814801 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:11 crc kubenswrapper[4991]: E1006 08:19:11.814933 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.165366 4991 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:12 crc kubenswrapper[4991]: E1006 08:19:12.178242 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.61:6443: connect: 
connection refused" interval="3.2s" Oct 06 08:19:12 crc kubenswrapper[4991]: W1006 08:19:12.225988 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:12 crc kubenswrapper[4991]: E1006 08:19:12.226090 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.284289 4991 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf" exitCode=0 Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.284399 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.284466 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.285408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.285441 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.285453 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 
08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.287269 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.287317 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.287334 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.287346 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.291685 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.291758 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.291780 4991 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.291791 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.293162 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.293205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.293217 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.295893 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.295908 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.295966 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.296897 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.296933 4991 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.296943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.298761 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7fae28e1f9e34b6670b19842581b89981626f77f1e3cec07a7c9a4610557c86d"} Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.298839 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.300076 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.300097 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.300107 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.423047 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.424565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.424659 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:12 crc kubenswrapper[4991]: I1006 08:19:12.424669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:12 crc 
kubenswrapper[4991]: I1006 08:19:12.424708 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:19:12 crc kubenswrapper[4991]: E1006 08:19:12.425257 4991 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.61:6443: connect: connection refused" node="crc" Oct 06 08:19:12 crc kubenswrapper[4991]: W1006 08:19:12.792698 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:12 crc kubenswrapper[4991]: E1006 08:19:12.792811 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.61:6443: connect: connection refused" logger="UnhandledError" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.165611 4991 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.61:6443: connect: connection refused Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.304254 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.313561 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1d8e1e9f243640c0da720d74f5350f5c761505efc4d08baeea028d84d3503f5e" exitCode=255 Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.313681 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1d8e1e9f243640c0da720d74f5350f5c761505efc4d08baeea028d84d3503f5e"} Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.313744 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.315096 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.315142 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.315153 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.316083 4991 scope.go:117] "RemoveContainer" containerID="1d8e1e9f243640c0da720d74f5350f5c761505efc4d08baeea028d84d3503f5e" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.317956 4991 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61" exitCode=0 Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.318003 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61"} Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.318119 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.318134 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:13 crc 
kubenswrapper[4991]: I1006 08:19:13.318150 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.318154 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.318121 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319620 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319654 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319664 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319743 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319773 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319785 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319789 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319813 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319827 4991 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319843 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.319931 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.566843 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:13 crc kubenswrapper[4991]: I1006 08:19:13.805771 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.325420 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1"} Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.325473 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383"} Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.325486 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1"} Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.328283 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.330527 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.330686 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1"} Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.330823 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.331624 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.331684 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.331702 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.331925 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.331973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.331993 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.429356 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.429665 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.431492 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.431545 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.431561 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.689012 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:14 crc kubenswrapper[4991]: I1006 08:19:14.752283 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.339813 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75"} Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.339888 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.339888 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3"} Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.340081 4991 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.340115 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.340213 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.341262 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.341354 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.341376 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.341904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.341930 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.341968 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.341990 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.341945 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.342031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 
08:19:15.626017 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.628059 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.628134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.628154 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.628205 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:19:15 crc kubenswrapper[4991]: I1006 08:19:15.649874 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.186723 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.196816 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.342504 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.342644 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.342679 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.348397 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.348450 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.348461 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.348569 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.348398 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.348691 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.348716 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.348513 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:16 crc kubenswrapper[4991]: I1006 08:19:16.348800 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.010850 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.345463 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.345580 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.346990 
4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.347064 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.347081 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.347644 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.348047 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.348066 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.384382 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.384647 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.385927 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.385975 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.385988 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.429845 4991 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 08:19:17 crc kubenswrapper[4991]: I1006 08:19:17.429959 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 08:19:18 crc kubenswrapper[4991]: I1006 08:19:18.352157 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:18 crc kubenswrapper[4991]: I1006 08:19:18.354273 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:18 crc kubenswrapper[4991]: I1006 08:19:18.354353 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:18 crc kubenswrapper[4991]: I1006 08:19:18.354371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:18 crc kubenswrapper[4991]: I1006 08:19:18.547791 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 06 08:19:18 crc kubenswrapper[4991]: I1006 08:19:18.548059 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:18 crc kubenswrapper[4991]: I1006 08:19:18.550008 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:18 crc kubenswrapper[4991]: I1006 08:19:18.550076 4991 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:18 crc kubenswrapper[4991]: I1006 08:19:18.550099 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:19 crc kubenswrapper[4991]: E1006 08:19:19.314852 4991 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 08:19:23 crc kubenswrapper[4991]: W1006 08:19:23.371410 4991 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 06 08:19:23 crc kubenswrapper[4991]: I1006 08:19:23.371590 4991 trace.go:236] Trace[580254294]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 08:19:13.370) (total time: 10001ms): Oct 06 08:19:23 crc kubenswrapper[4991]: Trace[580254294]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (08:19:23.371) Oct 06 08:19:23 crc kubenswrapper[4991]: Trace[580254294]: [10.00128471s] [10.00128471s] END Oct 06 08:19:23 crc kubenswrapper[4991]: E1006 08:19:23.371623 4991 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 06 08:19:23 crc kubenswrapper[4991]: I1006 08:19:23.568132 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 06 
08:19:23 crc kubenswrapper[4991]: I1006 08:19:23.568248 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 06 08:19:23 crc kubenswrapper[4991]: I1006 08:19:23.580368 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 06 08:19:23 crc kubenswrapper[4991]: I1006 08:19:23.580454 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 06 08:19:23 crc kubenswrapper[4991]: I1006 08:19:23.590364 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 06 08:19:23 crc kubenswrapper[4991]: I1006 08:19:23.590443 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 06 08:19:23 crc kubenswrapper[4991]: I1006 
08:19:23.807289 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 06 08:19:23 crc kubenswrapper[4991]: I1006 08:19:23.807398 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.019476 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.019788 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.021569 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.021630 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.021658 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.394193 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.394571 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 
08:19:27.394969 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.395061 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.396609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.396707 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.396732 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.399326 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.429869 4991 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 08:19:27 crc kubenswrapper[4991]: I1006 08:19:27.429936 4991 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.039492 4991 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.382085 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.382717 4991 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.382805 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.383570 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.383618 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.383634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:28 crc kubenswrapper[4991]: E1006 
08:19:28.574625 4991 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.577051 4991 trace.go:236] Trace[363258187]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 08:19:17.288) (total time: 11288ms): Oct 06 08:19:28 crc kubenswrapper[4991]: Trace[363258187]: ---"Objects listed" error: 11288ms (08:19:28.576) Oct 06 08:19:28 crc kubenswrapper[4991]: Trace[363258187]: [11.288430695s] [11.288430695s] END Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.577611 4991 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.579528 4991 trace.go:236] Trace[761810140]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 08:19:16.712) (total time: 11867ms): Oct 06 08:19:28 crc kubenswrapper[4991]: Trace[761810140]: ---"Objects listed" error: 11867ms (08:19:28.579) Oct 06 08:19:28 crc kubenswrapper[4991]: Trace[761810140]: [11.867258535s] [11.867258535s] END Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.579567 4991 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 06 08:19:28 crc kubenswrapper[4991]: E1006 08:19:28.581369 4991 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.581938 4991 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.582158 4991 trace.go:236] Trace[1374676874]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 08:19:18.536) (total time: 10045ms): Oct 06 08:19:28 crc kubenswrapper[4991]: Trace[1374676874]: ---"Objects listed" error: 10045ms (08:19:28.581) Oct 06 08:19:28 crc kubenswrapper[4991]: Trace[1374676874]: [10.045547041s] [10.045547041s] END Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.582201 4991 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.585398 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 06 08:19:28 crc kubenswrapper[4991]: I1006 08:19:28.599983 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.153611 4991 apiserver.go:52] "Watching apiserver" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.158037 4991 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.158612 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.159234 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.159242 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.159480 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.160720 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.160992 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.161372 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.162227 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.162377 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.162435 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.164270 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.170447 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.170758 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.170836 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.170899 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.170944 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.171506 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.171865 4991 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.172396 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.172554 4991 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185224 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185345 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185399 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185616 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185668 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185726 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185785 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185826 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185875 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185925 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 08:19:29 
crc kubenswrapper[4991]: I1006 08:19:29.185967 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186087 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186144 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186188 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186413 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186523 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186623 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186688 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186749 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.185955 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186805 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186930 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.186981 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187020 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187051 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187078 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187106 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187132 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187157 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187179 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187203 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187229 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187264 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187328 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187363 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187388 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187418 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") 
pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187444 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187473 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187508 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187545 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187581 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187616 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187645 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187686 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187716 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187751 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187785 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187824 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187856 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187889 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187921 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187952 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187997 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188033 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188073 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188109 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188158 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188197 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188231 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188262 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188292 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188418 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188465 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188510 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188562 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188603 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188642 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188686 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188721 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:29 
crc kubenswrapper[4991]: I1006 08:19:29.188766 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188804 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188898 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188936 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188970 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189011 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189063 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189106 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189150 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189192 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189234 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 
08:19:29.189277 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189454 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189504 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189542 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189583 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189618 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189655 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189698 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189746 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189791 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189835 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:19:29 crc 
kubenswrapper[4991]: I1006 08:19:29.189874 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189937 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189978 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190011 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190047 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190093 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190125 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190158 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190197 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190238 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190310 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc 
kubenswrapper[4991]: I1006 08:19:29.190358 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190396 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190430 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190470 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190509 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190551 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190588 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190628 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190660 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190683 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190714 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 
08:19:29.190739 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190763 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190789 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190816 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190847 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190932 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190971 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191002 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191030 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191055 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191083 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 
08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191111 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191137 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191162 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191187 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191216 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191240 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191320 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191357 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191383 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191407 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191435 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191468 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191502 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191545 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191580 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191606 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191630 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191656 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191683 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191709 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191743 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191769 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191792 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191817 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191842 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191864 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191888 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191912 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191937 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191960 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.191983 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192007 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192032 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192058 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192083 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192109 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192138 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192164 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192188 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192213 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192237 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192262 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192289 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192347 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192379 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192404 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192440 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192477 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192512 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192546 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192585 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192622 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192656 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192681 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192706 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 06 08:19:29 crc 
kubenswrapper[4991]: I1006 08:19:29.192836 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192870 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192901 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192929 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192953 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192983 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193007 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193034 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193058 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193085 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193111 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193136 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193161 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193187 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193213 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193282 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193625 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193708 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193750 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193836 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193911 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193985 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.194033 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.194104 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.194259 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.194460 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.194512 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.194655 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.194818 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.195347 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.196533 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187384 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187474 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187699 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187742 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.187977 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188431 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188473 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188515 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.188669 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189332 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189669 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189712 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.189767 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190167 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190244 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.190920 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.192110 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.193948 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.194146 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.196096 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.196205 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.196512 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.196687 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.197315 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.200572 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.201067 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.201136 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.201446 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.201791 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.201829 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.201855 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.201934 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.201969 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.201983 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.202497 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.202550 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.203035 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.203185 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.203280 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.203387 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.203763 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.208058 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.208368 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.208837 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.208961 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.209278 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.209423 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.209658 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.209667 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.209771 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.210041 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.210115 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.210525 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.210853 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.211044 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.211978 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.212366 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.212472 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.212740 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.212829 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.212860 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.212961 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.213076 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.213338 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.213444 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.213586 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.213711 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.213919 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.214048 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.214020 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.214179 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.214053 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.214229 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.214273 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:29.714243019 +0000 UTC m=+21.451993250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.214653 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.214925 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.215448 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.215779 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.216194 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.216364 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.216551 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.216703 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.218245 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.218251 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.218517 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.218710 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.219369 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.219405 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.220232 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.220355 4991 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.220502 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.220649 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.220679 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.221699 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.221783 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.223452 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.223864 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.224908 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.227249 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.228141 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.228311 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:29.728272437 +0000 UTC m=+21.466022458 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.229027 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.231006 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.232035 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.232519 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.232617 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.237340 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.238459 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.238646 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.238670 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.238869 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.239395 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.239951 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.240148 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.240590 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.241011 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.242225 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.242447 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.242943 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.243015 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:19:29.742853931 +0000 UTC m=+21.480603952 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.243763 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.243847 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.244283 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.244284 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.244908 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.245206 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.245690 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.245961 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.246258 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.252337 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.252564 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.252582 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.252687 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:29.7526623 +0000 UTC m=+21.490412321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.268860 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.268725 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.269250 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.269265 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.269896 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.271212 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.280766 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.280820 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.280840 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.280939 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:29.780912363 +0000 UTC m=+21.518662384 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.281385 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.288578 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.288804 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.288940 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.289107 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.289314 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.289429 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.289753 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.289914 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.289512 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.290168 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.290227 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.290668 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.290684 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.290804 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.290900 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.291139 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.291683 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.291854 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.291882 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.292006 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.292133 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.293561 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.293714 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.294278 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.294622 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.294721 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.294790 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.294811 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.294740 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.295040 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.295043 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.295062 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.295845 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.295856 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.296042 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.296177 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.296336 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.295883 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.296501 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.297069 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.297115 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.297589 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.297646 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.297742 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.297849 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.297875 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.297631 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298001 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298005 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298021 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298090 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298089 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298173 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298599 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298623 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298636 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298835 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298883 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298953 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.298975 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299096 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299551 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299590 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299594 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299614 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299631 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299645 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299568 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299726 4991 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299749 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299751 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299761 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299810 4991 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.299940 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300076 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300128 4991 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300180 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300200 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300214 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300226 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300239 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300266 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 06 
08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300234 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300281 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300325 4991 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300339 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300350 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300360 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300478 4991 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300498 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300512 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300526 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300542 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300557 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300570 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300582 4991 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300594 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300607 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300620 4991 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300633 4991 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300645 4991 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300658 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300672 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300687 4991 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300701 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300712 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300726 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300740 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300754 4991 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300766 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300779 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300792 4991 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300804 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300816 4991 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300828 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.300839 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301012 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") 
pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301083 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301092 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301332 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301353 4991 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301366 4991 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301379 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301408 4991 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301468 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301507 4991 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301523 4991 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301538 4991 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301553 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301565 4991 reconciler_common.go:293] "Volume detached for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301577 4991 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301590 4991 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301603 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301619 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301632 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301644 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.301657 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" 
Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302125 4991 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302152 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302166 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302180 4991 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302195 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302208 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302224 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 
08:19:29.302239 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302253 4991 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302265 4991 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302279 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302313 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302327 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302339 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302352 4991 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302364 4991 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302379 4991 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302394 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302407 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302421 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302433 4991 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302446 4991 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302461 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302474 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302488 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302679 4991 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302723 4991 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302737 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302750 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath 
\"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302776 4991 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302790 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302803 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302837 4991 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302854 4991 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302867 4991 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302881 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302894 4991 
reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302907 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302919 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302932 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302943 4991 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302955 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302956 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302974 4991 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.302987 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303000 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303013 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303026 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303043 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303056 4991 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc 
kubenswrapper[4991]: I1006 08:19:29.303070 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303084 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303090 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303096 4991 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303138 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303151 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303163 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303174 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303184 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303195 4991 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303206 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303217 4991 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303227 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303236 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303246 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303256 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303265 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303302 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303313 4991 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303322 4991 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303331 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303340 4991 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303349 4991 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303359 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303368 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303371 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303378 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303417 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303432 4991 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303435 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303607 4991 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303627 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303641 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303654 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303676 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303689 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303698 4991 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" 
DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303709 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303718 4991 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303728 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303738 4991 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303749 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303758 4991 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303767 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303776 
4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303784 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303793 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303802 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303810 4991 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303819 4991 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303827 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303836 4991 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303844 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303853 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.303862 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.304631 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.305669 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.307219 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.308185 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.311174 4991 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.311797 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.312488 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.313664 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.315399 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.317693 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 06 08:19:29 crc 
kubenswrapper[4991]: I1006 08:19:29.319288 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.321503 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.322048 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.323613 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.324187 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.324652 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.325841 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.328407 4991 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.330374 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.335545 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.336283 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.336972 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.337548 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.338174 4991 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.338351 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.339220 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.341464 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.342229 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.342608 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.343062 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.344818 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.345889 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.346478 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.347594 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.348281 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 06 08:19:29 crc 
kubenswrapper[4991]: I1006 08:19:29.349325 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.349937 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.350896 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.351500 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.352391 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.352907 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.353812 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.354094 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.354564 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.355649 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.356119 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.356972 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.357521 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.358105 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.358994 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.363574 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.380487 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.386884 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.387523 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.390043 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1" exitCode=255 Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.390142 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1"} Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.390250 4991 scope.go:117] "RemoveContainer" containerID="1d8e1e9f243640c0da720d74f5350f5c761505efc4d08baeea028d84d3503f5e" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.394340 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.398891 4991 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.401465 4991 scope.go:117] "RemoveContainer" containerID="9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1" Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.401703 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.401765 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.404896 4991 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.404925 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.404942 4991 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.404955 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.404971 4991 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.404986 4991 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405000 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405013 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: 
I1006 08:19:29.405026 4991 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405041 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405056 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405069 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405084 4991 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405098 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405110 4991 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405123 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405137 4991 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405151 4991 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405116 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405164 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405283 4991 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405324 4991 reconciler_common.go:293] 
"Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405340 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405353 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405368 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405392 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405406 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405420 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.405433 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.417569 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.440160 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-0
6T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.451494 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.463942 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.473704 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.487330 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.487454 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d8e1e9f243640c0da720d74f5350f5c761505efc4d08baeea028d84d3503f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"message\\\":\\\"W1006 08:19:12.451769 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:19:12.452142 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759738752 cert, and key in /tmp/serving-cert-3768995481/serving-signer.crt, /tmp/serving-cert-3768995481/serving-signer.key\\\\nI1006 08:19:12.840414 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:19:12.843536 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 08:19:12.843931 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:12.844831 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3768995481/tls.crt::/tmp/serving-cert-3768995481/tls.key\\\\\\\"\\\\nF1006 08:19:13.208749 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.499897 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: W1006 08:19:29.502845 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-bbe583f89e54591c56b50ce3bbb524360c64f62a41c17d3ea14859528b518d37 WatchSource:0}: Error finding container bbe583f89e54591c56b50ce3bbb524360c64f62a41c17d3ea14859528b518d37: Status 404 returned error can't find the container with id bbe583f89e54591c56b50ce3bbb524360c64f62a41c17d3ea14859528b518d37 Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.508076 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.510466 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.523988 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.530241 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 08:19:29 crc kubenswrapper[4991]: W1006 08:19:29.530520 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6284bfa9bb1ea2ef5d98f89e344514b750394c4e6bbefe62bceb5613b8b122b4 WatchSource:0}: Error finding container 6284bfa9bb1ea2ef5d98f89e344514b750394c4e6bbefe62bceb5613b8b122b4: Status 404 returned error can't find the container with id 6284bfa9bb1ea2ef5d98f89e344514b750394c4e6bbefe62bceb5613b8b122b4 Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.541791 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: W1006 08:19:29.552634 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-8b252b9092b426cdb722d3cb235513ba31f4c00a8781fc605cc465f318ce8bb9 WatchSource:0}: Error finding container 8b252b9092b426cdb722d3cb235513ba31f4c00a8781fc605cc465f318ce8bb9: Status 404 returned error can't find the container with id 8b252b9092b426cdb722d3cb235513ba31f4c00a8781fc605cc465f318ce8bb9 Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.563440 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.578788 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.595522 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.810137 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.810227 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.810259 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.810280 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:29 crc kubenswrapper[4991]: I1006 08:19:29.810325 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810474 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810493 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810506 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810513 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810557 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:30.810543216 +0000 UTC m=+22.548293237 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810625 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810649 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:19:30.810588997 +0000 UTC m=+22.548339058 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810691 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810764 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810795 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810718 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:30.81069935 +0000 UTC m=+22.548449411 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.810953 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:30.810909776 +0000 UTC m=+22.548659957 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:19:29 crc kubenswrapper[4991]: E1006 08:19:29.811010 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:30.810990398 +0000 UTC m=+22.548740619 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.394982 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c"} Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.395682 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bbe583f89e54591c56b50ce3bbb524360c64f62a41c17d3ea14859528b518d37"} Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.398659 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.402353 4991 scope.go:117] "RemoveContainer" containerID="9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1" Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.402563 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 08:19:30 crc 
kubenswrapper[4991]: I1006 08:19:30.403195 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8b252b9092b426cdb722d3cb235513ba31f4c00a8781fc605cc465f318ce8bb9"} Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.405572 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2"} Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.405614 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8"} Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.405629 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6284bfa9bb1ea2ef5d98f89e344514b750394c4e6bbefe62bceb5613b8b122b4"} Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.418105 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.445629 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-0
6T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.460140 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.474691 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.492637 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d8e1e9f243640c0da720d74f5350f5c761505efc4d08baeea028d84d3503f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"message\\\":\\\"W1006 08:19:12.451769 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 08:19:12.452142 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759738752 cert, and key in /tmp/serving-cert-3768995481/serving-signer.crt, /tmp/serving-cert-3768995481/serving-signer.key\\\\nI1006 08:19:12.840414 1 observer_polling.go:159] Starting file observer\\\\nW1006 08:19:12.843536 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 08:19:12.843931 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:12.844831 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3768995481/tls.crt::/tmp/serving-cert-3768995481/tls.key\\\\\\\"\\\\nF1006 08:19:13.208749 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.515925 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.540207 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.558916 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.581038 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.606102 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.622221 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.643913 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.665960 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.685888 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.705424 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.718649 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.819695 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.819775 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.819810 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.819833 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:30 crc kubenswrapper[4991]: I1006 08:19:30.819857 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.819938 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.819977 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:19:32.819933055 +0000 UTC m=+24.557683106 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820022 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:32.820004407 +0000 UTC m=+24.557754458 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820047 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820098 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820116 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 
08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820047 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820193 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:32.820167542 +0000 UTC m=+24.557917723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820225 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:32.820213763 +0000 UTC m=+24.557963974 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820166 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820258 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820270 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:30 crc kubenswrapper[4991]: E1006 08:19:30.820320 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:32.820310526 +0000 UTC m=+24.558060747 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.242969 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.243073 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:31 crc kubenswrapper[4991]: E1006 08:19:31.243193 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.243242 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:31 crc kubenswrapper[4991]: E1006 08:19:31.243489 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:31 crc kubenswrapper[4991]: E1006 08:19:31.243704 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.251371 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.252858 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.255443 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.256841 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.258791 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.260605 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.262427 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.264684 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 06 08:19:31 crc kubenswrapper[4991]: I1006 08:19:31.266070 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 06 08:19:32 crc kubenswrapper[4991]: I1006 08:19:32.840033 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.840364 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.840282149 +0000 UTC m=+28.578032180 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:19:32 crc kubenswrapper[4991]: I1006 08:19:32.840951 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:32 crc kubenswrapper[4991]: I1006 08:19:32.841021 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:32 crc kubenswrapper[4991]: I1006 08:19:32.841076 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:32 crc kubenswrapper[4991]: I1006 08:19:32.841131 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.841343 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.841362 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.841415 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.841435 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.841418642 +0000 UTC m=+28.579168673 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.841457 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.841487 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.841523 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.841483143 +0000 UTC m=+28.579233374 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.841664 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-06 08:19:36.841640658 +0000 UTC m=+28.579390899 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.841975 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.842012 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.842043 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:32 crc kubenswrapper[4991]: E1006 08:19:32.842114 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.842092391 +0000 UTC m=+28.579842672 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.243108 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.243203 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.243101 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:33 crc kubenswrapper[4991]: E1006 08:19:33.243408 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:33 crc kubenswrapper[4991]: E1006 08:19:33.243555 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:33 crc kubenswrapper[4991]: E1006 08:19:33.243726 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.415409 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d"} Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.436133 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.449088 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.468309 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.485082 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.501465 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a41
8fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.519670 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.538234 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.551964 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.751082 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-scqml"] Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.751488 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-scqml" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.755524 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.755660 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.755712 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.778763 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.794470 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.806370 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.807065 4991 scope.go:117] "RemoveContainer" containerID="9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1" Oct 06 08:19:33 crc kubenswrapper[4991]: E1006 08:19:33.807237 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.808589 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.822801 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.837816 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.851772 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c92a7298-0ed4-4956-98d8-8eb78df3f1e3-hosts-file\") pod \"node-resolver-scqml\" (UID: \"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\") " pod="openshift-dns/node-resolver-scqml" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.851841 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4kzd\" (UniqueName: 
\"kubernetes.io/projected/c92a7298-0ed4-4956-98d8-8eb78df3f1e3-kube-api-access-h4kzd\") pod \"node-resolver-scqml\" (UID: \"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\") " pod="openshift-dns/node-resolver-scqml" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.866343 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.891713 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.919057 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.942744 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:33Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.953052 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c92a7298-0ed4-4956-98d8-8eb78df3f1e3-hosts-file\") pod \"node-resolver-scqml\" (UID: \"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\") " pod="openshift-dns/node-resolver-scqml" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.953117 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4kzd\" (UniqueName: \"kubernetes.io/projected/c92a7298-0ed4-4956-98d8-8eb78df3f1e3-kube-api-access-h4kzd\") pod \"node-resolver-scqml\" (UID: \"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\") " pod="openshift-dns/node-resolver-scqml" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.953237 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c92a7298-0ed4-4956-98d8-8eb78df3f1e3-hosts-file\") pod \"node-resolver-scqml\" (UID: \"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\") " pod="openshift-dns/node-resolver-scqml" Oct 06 08:19:33 crc kubenswrapper[4991]: I1006 08:19:33.971464 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4kzd\" (UniqueName: \"kubernetes.io/projected/c92a7298-0ed4-4956-98d8-8eb78df3f1e3-kube-api-access-h4kzd\") pod \"node-resolver-scqml\" (UID: 
\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\") " pod="openshift-dns/node-resolver-scqml" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.067976 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-scqml" Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.082652 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc92a7298_0ed4_4956_98d8_8eb78df3f1e3.slice/crio-89f58bb237665531f006da6822957259615d36c614f947923f961b247190d7f6 WatchSource:0}: Error finding container 89f58bb237665531f006da6822957259615d36c614f947923f961b247190d7f6: Status 404 returned error can't find the container with id 89f58bb237665531f006da6822957259615d36c614f947923f961b247190d7f6 Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.420751 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-scqml" event={"ID":"c92a7298-0ed4-4956-98d8-8eb78df3f1e3","Type":"ContainerStarted","Data":"546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a"} Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.420830 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-scqml" event={"ID":"c92a7298-0ed4-4956-98d8-8eb78df3f1e3","Type":"ContainerStarted","Data":"89f58bb237665531f006da6822957259615d36c614f947923f961b247190d7f6"} Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.433377 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.443241 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.448808 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0
,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.476451 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.477814 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.498522 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.525058 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.543199 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.574734 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-wpb6m"] Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.574845 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.575214 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xjvmw"] Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.575515 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-pgn9b"] Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.575630 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.575746 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.576694 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.579268 4991 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 06 08:19:34 crc kubenswrapper[4991]: E1006 08:19:34.579360 4991 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.579268 4991 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 06 08:19:34 crc kubenswrapper[4991]: E1006 08:19:34.579409 4991 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.579402 4991 reflector.go:561] 
object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 06 08:19:34 crc kubenswrapper[4991]: E1006 08:19:34.579442 4991 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.579629 4991 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 06 08:19:34 crc kubenswrapper[4991]: E1006 08:19:34.579653 4991 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.580996 4991 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace 
"openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 06 08:19:34 crc kubenswrapper[4991]: E1006 08:19:34.581029 4991 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.581664 4991 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 06 08:19:34 crc kubenswrapper[4991]: E1006 08:19:34.581701 4991 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.581747 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.581821 4991 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list 
resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 06 08:19:34 crc kubenswrapper[4991]: E1006 08:19:34.581842 4991 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.581930 4991 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.581946 4991 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 06 08:19:34 crc kubenswrapper[4991]: E1006 08:19:34.581977 4991 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:19:34 crc kubenswrapper[4991]: E1006 08:19:34.581997 4991 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:19:34 crc kubenswrapper[4991]: W1006 08:19:34.581669 4991 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list *v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 06 08:19:34 crc kubenswrapper[4991]: E1006 08:19:34.582264 4991 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.584711 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.600984 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.621608 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.641160 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.660049 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65471d7d-65b6-49ce-90be-171db9b3cb42-proxy-tls\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.660422 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7p92\" (UniqueName: \"kubernetes.io/projected/65471d7d-65b6-49ce-90be-171db9b3cb42-kube-api-access-g7p92\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.660588 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc2xh\" (UniqueName: \"kubernetes.io/projected/881045ce-f2cf-41d3-a315-eec70d0ed97d-kube-api-access-bc2xh\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.660760 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-run-multus-certs\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.660880 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-system-cni-dir\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.660996 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzc78\" (UniqueName: \"kubernetes.io/projected/58386a1a-6047-42ce-a952-43f397822919-kube-api-access-xzc78\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661103 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/881045ce-f2cf-41d3-a315-eec70d0ed97d-cni-binary-copy\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661192 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-var-lib-cni-bin\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661331 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-cnibin\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661438 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-cnibin\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661516 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-run-netns\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661588 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-var-lib-cni-multus\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661667 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-system-cni-dir\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661766 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-os-release\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661862 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-cni-binary-copy\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661936 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-var-lib-kubelet\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662021 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-multus-socket-dir-parent\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.661912 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662104 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-multus-conf-dir\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662343 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-run-k8s-cni-cncf-io\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662384 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65471d7d-65b6-49ce-90be-171db9b3cb42-rootfs\") pod \"machine-config-daemon-wpb6m\" 
(UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662410 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-etc-kubernetes\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662435 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/881045ce-f2cf-41d3-a315-eec70d0ed97d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662464 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-hostroot\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662496 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-multus-cni-dir\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662525 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65471d7d-65b6-49ce-90be-171db9b3cb42-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662555 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662588 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-multus-daemon-config\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.662622 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-os-release\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.686010 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.700717 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.714515 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.726624 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.742172 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.762050 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763328 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-multus-socket-dir-parent\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763378 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-multus-conf-dir\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763412 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-run-k8s-cni-cncf-io\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763439 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65471d7d-65b6-49ce-90be-171db9b3cb42-rootfs\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763465 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/881045ce-f2cf-41d3-a315-eec70d0ed97d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763488 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-etc-kubernetes\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763503 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-run-k8s-cni-cncf-io\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763522 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-multus-conf-dir\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763566 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-etc-kubernetes\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763571 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-hostroot\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763514 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-hostroot\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763622 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/65471d7d-65b6-49ce-90be-171db9b3cb42-rootfs\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 
06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763644 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-multus-cni-dir\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763666 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65471d7d-65b6-49ce-90be-171db9b3cb42-mcd-auth-proxy-config\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763689 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-multus-daemon-config\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763706 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-os-release\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763720 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc 
kubenswrapper[4991]: I1006 08:19:34.763726 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-multus-socket-dir-parent\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763741 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65471d7d-65b6-49ce-90be-171db9b3cb42-proxy-tls\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763758 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7p92\" (UniqueName: \"kubernetes.io/projected/65471d7d-65b6-49ce-90be-171db9b3cb42-kube-api-access-g7p92\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763789 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc2xh\" (UniqueName: \"kubernetes.io/projected/881045ce-f2cf-41d3-a315-eec70d0ed97d-kube-api-access-bc2xh\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763846 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-run-multus-certs\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 
crc kubenswrapper[4991]: I1006 08:19:34.763865 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-system-cni-dir\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763895 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzc78\" (UniqueName: \"kubernetes.io/projected/58386a1a-6047-42ce-a952-43f397822919-kube-api-access-xzc78\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763916 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/881045ce-f2cf-41d3-a315-eec70d0ed97d-cni-binary-copy\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763936 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-cnibin\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763958 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-var-lib-cni-bin\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: 
I1006 08:19:34.763977 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-cnibin\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.763999 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-run-netns\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764021 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-var-lib-cni-multus\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764042 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-system-cni-dir\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764042 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-cnibin\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764062 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-os-release\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764068 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-multus-cni-dir\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764087 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-var-lib-kubelet\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764110 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-cni-binary-copy\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764187 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-run-multus-certs\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764263 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-run-netns\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " 
pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764275 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-system-cni-dir\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764234 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-var-lib-kubelet\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764314 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-cnibin\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764287 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-os-release\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764328 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-var-lib-cni-multus\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764279 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-os-release\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764338 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-host-var-lib-cni-bin\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764366 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58386a1a-6047-42ce-a952-43f397822919-system-cni-dir\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.764610 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/881045ce-f2cf-41d3-a315-eec70d0ed97d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.776037 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.792694 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.812058 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.828502 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.843445 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.857269 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.973261 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qwljw"] Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.974581 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.979502 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.979537 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.980963 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.980969 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.981512 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.981515 4991 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.982211 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.982286 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.983559 4991 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.983611 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.983627 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.983771 4991 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.992980 4991 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.993344 4991 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.993611 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:34Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.994561 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.994599 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.994611 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.994630 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:34 crc kubenswrapper[4991]: I1006 08:19:34.994642 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:34Z","lastTransitionTime":"2025-10-06T08:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.012439 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.016107 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.020369 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.020436 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.020449 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.020478 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.020492 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.027023 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.031070 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.034879 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.034928 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.034942 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.034959 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.035009 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.046337 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.047709 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.049783 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.049833 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.049845 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.049865 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.049879 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.059770 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.062272 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.066243 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.066272 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.066282 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.066317 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.066331 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067045 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-openvswitch\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067116 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovn-node-metrics-cert\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067218 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-env-overrides\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067288 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmj9m\" (UniqueName: \"kubernetes.io/projected/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-kube-api-access-cmj9m\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067357 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-bin\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067384 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-config\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067422 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-systemd-units\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067455 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-var-lib-openvswitch\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067483 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-qwljw\" (UID: 
\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067534 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-slash\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067563 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-netns\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067625 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-ovn\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067690 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-kubelet\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067725 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-log-socket\") pod \"ovnkube-node-qwljw\" (UID: 
\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067758 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-script-lib\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067786 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-netd\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067822 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067877 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-node-log\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067928 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-systemd\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.067960 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-etc-openvswitch\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.073029 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.079527 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.079722 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.081478 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.081510 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.081519 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.081536 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.081549 4991 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.088564 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.102419 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.115125 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.127773 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.145812 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169012 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169462 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-kubelet\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169513 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-log-socket\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169541 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-script-lib\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169566 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-netd\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169588 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169589 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-kubelet\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169619 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-node-log\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169648 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-systemd\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169659 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-netd\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169669 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-etc-openvswitch\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169672 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-log-socket\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169706 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169713 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-openvswitch\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169765 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-node-log\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169777 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-openvswitch\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169825 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-systemd\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169838 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-etc-openvswitch\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169848 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovn-node-metrics-cert\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169905 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-env-overrides\") pod 
\"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169924 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmj9m\" (UniqueName: \"kubernetes.io/projected/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-kube-api-access-cmj9m\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169947 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-bin\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169966 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-config\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.169993 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-systemd-units\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170015 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-var-lib-openvswitch\") pod \"ovnkube-node-qwljw\" (UID: 
\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170032 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170062 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-slash\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170084 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-netns\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170104 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-ovn\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170161 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-ovn\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 
08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170188 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-bin\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170368 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-ovn-kubernetes\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170412 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-script-lib\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170424 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-systemd-units\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170460 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-slash\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170465 4991 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-var-lib-openvswitch\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170490 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-netns\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170788 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-env-overrides\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.170830 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-config\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.173603 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovn-node-metrics-cert\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.181803 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.183947 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.183983 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.183998 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.184019 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.184032 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.187725 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmj9m\" (UniqueName: \"kubernetes.io/projected/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-kube-api-access-cmj9m\") pod \"ovnkube-node-qwljw\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.194098 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:35Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.242885 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.242943 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.243040 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.242959 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.243125 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.243179 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.287343 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.287396 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.287408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.287434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.287449 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.288520 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:35 crc kubenswrapper[4991]: W1006 08:19:35.305836 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod977b0faa_5b3d_4e9d_bef4_ba47f8764c6e.slice/crio-0327300df1417f6fb788ab88272e82c63b85a47a4f13f7399314ae024c9a0093 WatchSource:0}: Error finding container 0327300df1417f6fb788ab88272e82c63b85a47a4f13f7399314ae024c9a0093: Status 404 returned error can't find the container with id 0327300df1417f6fb788ab88272e82c63b85a47a4f13f7399314ae024c9a0093 Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.390951 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.390991 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.391003 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.391020 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.391031 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.425338 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.425419 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"0327300df1417f6fb788ab88272e82c63b85a47a4f13f7399314ae024c9a0093"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.494227 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.494274 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.494285 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.494328 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.494341 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.496786 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.505330 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65471d7d-65b6-49ce-90be-171db9b3cb42-mcd-auth-proxy-config\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.553138 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.555324 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/881045ce-f2cf-41d3-a315-eec70d0ed97d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.573372 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.597529 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.597588 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.597600 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.597621 
4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.597638 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.647025 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.700419 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.700479 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.700493 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.700514 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.700531 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.764667 4991 secret.go:188] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.764738 4991 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.764801 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65471d7d-65b6-49ce-90be-171db9b3cb42-proxy-tls podName:65471d7d-65b6-49ce-90be-171db9b3cb42 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.264773873 +0000 UTC m=+28.002523904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/65471d7d-65b6-49ce-90be-171db9b3cb42-proxy-tls") pod "machine-config-daemon-wpb6m" (UID: "65471d7d-65b6-49ce-90be-171db9b3cb42") : failed to sync secret cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.764851 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/881045ce-f2cf-41d3-a315-eec70d0ed97d-cni-binary-copy podName:881045ce-f2cf-41d3-a315-eec70d0ed97d nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.264826464 +0000 UTC m=+28.002576485 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/881045ce-f2cf-41d3-a315-eec70d0ed97d-cni-binary-copy") pod "multus-additional-cni-plugins-pgn9b" (UID: "881045ce-f2cf-41d3-a315-eec70d0ed97d") : failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.764738 4991 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.764890 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-cni-binary-copy podName:58386a1a-6047-42ce-a952-43f397822919 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.264883216 +0000 UTC m=+28.002633237 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-cni-binary-copy") pod "multus-xjvmw" (UID: "58386a1a-6047-42ce-a952-43f397822919") : failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.764758 4991 configmap.go:193] Couldn't get configMap openshift-multus/multus-daemon-config: failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.764918 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-multus-daemon-config podName:58386a1a-6047-42ce-a952-43f397822919 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.264913737 +0000 UTC m=+28.002663758 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "multus-daemon-config" (UniqueName: "kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-multus-daemon-config") pod "multus-xjvmw" (UID: "58386a1a-6047-42ce-a952-43f397822919") : failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.778670 4991 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.778723 4991 projected.go:194] Error preparing data for projected volume kube-api-access-xzc78 for pod openshift-multus/multus-xjvmw: failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.778766 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58386a1a-6047-42ce-a952-43f397822919-kube-api-access-xzc78 podName:58386a1a-6047-42ce-a952-43f397822919 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.27875304 +0000 UTC m=+28.016503061 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xzc78" (UniqueName: "kubernetes.io/projected/58386a1a-6047-42ce-a952-43f397822919-kube-api-access-xzc78") pod "multus-xjvmw" (UID: "58386a1a-6047-42ce-a952-43f397822919") : failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.778675 4991 projected.go:288] Couldn't get configMap openshift-machine-config-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.778806 4991 projected.go:194] Error preparing data for projected volume kube-api-access-g7p92 for pod openshift-machine-config-operator/machine-config-daemon-wpb6m: failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.778844 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65471d7d-65b6-49ce-90be-171db9b3cb42-kube-api-access-g7p92 podName:65471d7d-65b6-49ce-90be-171db9b3cb42 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.278831752 +0000 UTC m=+28.016581993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-g7p92" (UniqueName: "kubernetes.io/projected/65471d7d-65b6-49ce-90be-171db9b3cb42-kube-api-access-g7p92") pod "machine-config-daemon-wpb6m" (UID: "65471d7d-65b6-49ce-90be-171db9b3cb42") : failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.780897 4991 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.780945 4991 projected.go:194] Error preparing data for projected volume kube-api-access-bc2xh for pod openshift-multus/multus-additional-cni-plugins-pgn9b: failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: E1006 08:19:35.781007 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/881045ce-f2cf-41d3-a315-eec70d0ed97d-kube-api-access-bc2xh podName:881045ce-f2cf-41d3-a315-eec70d0ed97d nodeName:}" failed. No retries permitted until 2025-10-06 08:19:36.280988324 +0000 UTC m=+28.018738545 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bc2xh" (UniqueName: "kubernetes.io/projected/881045ce-f2cf-41d3-a315-eec70d0ed97d-kube-api-access-bc2xh") pod "multus-additional-cni-plugins-pgn9b" (UID: "881045ce-f2cf-41d3-a315-eec70d0ed97d") : failed to sync configmap cache: timed out waiting for the condition Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.803576 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.803611 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.803621 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.803638 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.803649 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.804428 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.906428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.906486 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.906498 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.906520 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.906539 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:35Z","lastTransitionTime":"2025-10-06T08:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.913339 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 06 08:19:35 crc kubenswrapper[4991]: I1006 08:19:35.978817 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.010523 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.010595 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.010609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.010634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.010652 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:36Z","lastTransitionTime":"2025-10-06T08:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.099864 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.110699 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.113509 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.113567 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.113579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.113600 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.113612 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:36Z","lastTransitionTime":"2025-10-06T08:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.129059 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.216266 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.216340 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.216354 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.216371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.216384 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:36Z","lastTransitionTime":"2025-10-06T08:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.282172 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-multus-daemon-config\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.282426 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65471d7d-65b6-49ce-90be-171db9b3cb42-proxy-tls\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.282450 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7p92\" (UniqueName: \"kubernetes.io/projected/65471d7d-65b6-49ce-90be-171db9b3cb42-kube-api-access-g7p92\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.282482 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc2xh\" (UniqueName: \"kubernetes.io/projected/881045ce-f2cf-41d3-a315-eec70d0ed97d-kube-api-access-bc2xh\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.282513 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzc78\" (UniqueName: \"kubernetes.io/projected/58386a1a-6047-42ce-a952-43f397822919-kube-api-access-xzc78\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") 
" pod="openshift-multus/multus-xjvmw" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.282534 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/881045ce-f2cf-41d3-a315-eec70d0ed97d-cni-binary-copy\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.282562 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-cni-binary-copy\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.283511 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-cni-binary-copy\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.283648 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/881045ce-f2cf-41d3-a315-eec70d0ed97d-cni-binary-copy\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.284581 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/58386a1a-6047-42ce-a952-43f397822919-multus-daemon-config\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 
08:19:36.286656 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc2xh\" (UniqueName: \"kubernetes.io/projected/881045ce-f2cf-41d3-a315-eec70d0ed97d-kube-api-access-bc2xh\") pod \"multus-additional-cni-plugins-pgn9b\" (UID: \"881045ce-f2cf-41d3-a315-eec70d0ed97d\") " pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.286811 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzc78\" (UniqueName: \"kubernetes.io/projected/58386a1a-6047-42ce-a952-43f397822919-kube-api-access-xzc78\") pod \"multus-xjvmw\" (UID: \"58386a1a-6047-42ce-a952-43f397822919\") " pod="openshift-multus/multus-xjvmw" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.287931 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7p92\" (UniqueName: \"kubernetes.io/projected/65471d7d-65b6-49ce-90be-171db9b3cb42-kube-api-access-g7p92\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.288492 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65471d7d-65b6-49ce-90be-171db9b3cb42-proxy-tls\") pod \"machine-config-daemon-wpb6m\" (UID: \"65471d7d-65b6-49ce-90be-171db9b3cb42\") " pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.319261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.319339 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.319354 4991 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.319376 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.319389 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:36Z","lastTransitionTime":"2025-10-06T08:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.389241 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.396550 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xjvmw" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.402092 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.422357 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.422406 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.422419 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.422441 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.422454 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:36Z","lastTransitionTime":"2025-10-06T08:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:36 crc kubenswrapper[4991]: W1006 08:19:36.428269 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58386a1a_6047_42ce_a952_43f397822919.slice/crio-2365928ba7d33c93cb9d35b18a9cd7b86e5add2a58343dbb0fa7fac32c43d9e2 WatchSource:0}: Error finding container 2365928ba7d33c93cb9d35b18a9cd7b86e5add2a58343dbb0fa7fac32c43d9e2: Status 404 returned error can't find the container with id 2365928ba7d33c93cb9d35b18a9cd7b86e5add2a58343dbb0fa7fac32c43d9e2 Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.430529 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83" exitCode=0 Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.430629 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.433258 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" event={"ID":"881045ce-f2cf-41d3-a315-eec70d0ed97d","Type":"ContainerStarted","Data":"132056cc4b39fcacb9a39eeda0481ffd392cd05921140fe20b593cba74eec7b0"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.434362 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"50a65ffc339e6ab040982930456e2d151520ff215ba78cd5a2691f703672f911"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.447952 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.463936 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.484132 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.505215 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.524915 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.531536 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.532400 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.532420 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.532461 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.532472 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:36Z","lastTransitionTime":"2025-10-06T08:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.551180 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.564750 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.587317 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805
c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.601143 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.617743 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.638972 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.639141 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.639457 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.639715 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.639804 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:36Z","lastTransitionTime":"2025-10-06T08:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.641547 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.654859 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.668380 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.686708 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.742831 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.742900 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.742916 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.742936 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.742949 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:36Z","lastTransitionTime":"2025-10-06T08:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.845997 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.846058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.846072 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.846101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.846118 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:36Z","lastTransitionTime":"2025-10-06T08:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.866460 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bjjz6"] Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.866848 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.869488 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.869518 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.871517 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.871523 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.882713 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.888911 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:36 crc 
kubenswrapper[4991]: I1006 08:19:36.889062 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889080 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:19:44.889052466 +0000 UTC m=+36.626802487 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.889122 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.889161 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.889207 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889278 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889340 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:44.889332164 +0000 UTC m=+36.627082185 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889339 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889362 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889400 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889422 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:44.889400726 +0000 UTC m=+36.627150767 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889423 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889476 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:44.889467428 +0000 UTC m=+36.627217469 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889471 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889529 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889549 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:36 crc kubenswrapper[4991]: E1006 08:19:36.889642 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:44.889615832 +0000 UTC m=+36.627365863 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.894383 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scri
pt\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.910684 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.926500 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.942835 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.949995 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.950054 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.950067 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.950088 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.950110 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:36Z","lastTransitionTime":"2025-10-06T08:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.958835 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.977554 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.990215 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/270ca557-afe0-4918-b9b9-0beae133a293-serviceca\") pod \"node-ca-bjjz6\" (UID: \"270ca557-afe0-4918-b9b9-0beae133a293\") " pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.990262 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/270ca557-afe0-4918-b9b9-0beae133a293-host\") pod \"node-ca-bjjz6\" (UID: \"270ca557-afe0-4918-b9b9-0beae133a293\") " pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.990383 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4m5n\" (UniqueName: 
\"kubernetes.io/projected/270ca557-afe0-4918-b9b9-0beae133a293-kube-api-access-g4m5n\") pod \"node-ca-bjjz6\" (UID: \"270ca557-afe0-4918-b9b9-0beae133a293\") " pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:36 crc kubenswrapper[4991]: I1006 08:19:36.998324 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.012838 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.026571 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.045580 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.052699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.052748 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.052758 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.052776 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.052787 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:37Z","lastTransitionTime":"2025-10-06T08:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.060059 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.075457 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.088367 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.091032 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/270ca557-afe0-4918-b9b9-0beae133a293-serviceca\") pod \"node-ca-bjjz6\" (UID: \"270ca557-afe0-4918-b9b9-0beae133a293\") " pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.091079 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/270ca557-afe0-4918-b9b9-0beae133a293-host\") pod \"node-ca-bjjz6\" (UID: \"270ca557-afe0-4918-b9b9-0beae133a293\") " pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.091132 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4m5n\" (UniqueName: \"kubernetes.io/projected/270ca557-afe0-4918-b9b9-0beae133a293-kube-api-access-g4m5n\") pod \"node-ca-bjjz6\" (UID: \"270ca557-afe0-4918-b9b9-0beae133a293\") " pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.091332 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/270ca557-afe0-4918-b9b9-0beae133a293-host\") pod \"node-ca-bjjz6\" (UID: \"270ca557-afe0-4918-b9b9-0beae133a293\") " pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.092092 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/270ca557-afe0-4918-b9b9-0beae133a293-serviceca\") pod \"node-ca-bjjz6\" (UID: \"270ca557-afe0-4918-b9b9-0beae133a293\") " pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.108518 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4m5n\" (UniqueName: \"kubernetes.io/projected/270ca557-afe0-4918-b9b9-0beae133a293-kube-api-access-g4m5n\") pod \"node-ca-bjjz6\" (UID: \"270ca557-afe0-4918-b9b9-0beae133a293\") " pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.111087 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.169959 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.170009 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.170018 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.170044 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.170057 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:37Z","lastTransitionTime":"2025-10-06T08:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.181938 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bjjz6" Oct 06 08:19:37 crc kubenswrapper[4991]: W1006 08:19:37.194271 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod270ca557_afe0_4918_b9b9_0beae133a293.slice/crio-6e6e240179919889573a1173b57c97d0cbef9f2e40d1faa7aad039c40843014c WatchSource:0}: Error finding container 6e6e240179919889573a1173b57c97d0cbef9f2e40d1faa7aad039c40843014c: Status 404 returned error can't find the container with id 6e6e240179919889573a1173b57c97d0cbef9f2e40d1faa7aad039c40843014c Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.242972 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.243003 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.243133 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:37 crc kubenswrapper[4991]: E1006 08:19:37.243247 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:37 crc kubenswrapper[4991]: E1006 08:19:37.243392 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:37 crc kubenswrapper[4991]: E1006 08:19:37.243608 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.272929 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.272983 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.272997 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.273019 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.273033 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:37Z","lastTransitionTime":"2025-10-06T08:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.383565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.383612 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.383630 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.383654 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.383668 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:37Z","lastTransitionTime":"2025-10-06T08:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.441611 4991 generic.go:334] "Generic (PLEG): container finished" podID="881045ce-f2cf-41d3-a315-eec70d0ed97d" containerID="1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe" exitCode=0 Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.441683 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" event={"ID":"881045ce-f2cf-41d3-a315-eec70d0ed97d","Type":"ContainerDied","Data":"1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.443285 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjvmw" event={"ID":"58386a1a-6047-42ce-a952-43f397822919","Type":"ContainerStarted","Data":"688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.443353 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjvmw" event={"ID":"58386a1a-6047-42ce-a952-43f397822919","Type":"ContainerStarted","Data":"2365928ba7d33c93cb9d35b18a9cd7b86e5add2a58343dbb0fa7fac32c43d9e2"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.446317 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.446385 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.450110 4991 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bjjz6" event={"ID":"270ca557-afe0-4918-b9b9-0beae133a293","Type":"ContainerStarted","Data":"62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.450171 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bjjz6" event={"ID":"270ca557-afe0-4918-b9b9-0beae133a293","Type":"ContainerStarted","Data":"6e6e240179919889573a1173b57c97d0cbef9f2e40d1faa7aad039c40843014c"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.456841 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.456924 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.456941 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.456955 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.456970 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" 
event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.456992 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.469939 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var
/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a56
46fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.487258 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.487565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.487617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.487633 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.487655 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.487670 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:37Z","lastTransitionTime":"2025-10-06T08:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.502873 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.515830 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.531196 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.550983 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.566391 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.581873 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.593594 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.593904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.593953 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.593977 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.593993 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:37Z","lastTransitionTime":"2025-10-06T08:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.596348 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.607333 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.620652 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.642549 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.654857 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.668512 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.681615 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.696082 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.696977 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.697016 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.697025 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.697041 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.697051 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:37Z","lastTransitionTime":"2025-10-06T08:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.709052 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.722144 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.740033 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.752885 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.767961 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.782263 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.799573 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.799629 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.799640 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:37 crc 
kubenswrapper[4991]: I1006 08:19:37.799661 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.799676 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:37Z","lastTransitionTime":"2025-10-06T08:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.821535 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.860251 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.889267 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.902814 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.902860 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.902870 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:37 crc 
kubenswrapper[4991]: I1006 08:19:37.902886 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.902897 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:37Z","lastTransitionTime":"2025-10-06T08:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.904898 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.921766 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.946478 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.962868 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:37 crc kubenswrapper[4991]: I1006 08:19:37.979163 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:37Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.004982 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.005013 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.005021 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.005037 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.005048 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:38Z","lastTransitionTime":"2025-10-06T08:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.108232 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.108344 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.108371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.108411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.108436 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:38Z","lastTransitionTime":"2025-10-06T08:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.211970 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.212044 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.212062 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.212088 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.212107 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:38Z","lastTransitionTime":"2025-10-06T08:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.315350 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.315402 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.315413 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.315448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.315465 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:38Z","lastTransitionTime":"2025-10-06T08:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.418373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.419344 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.419358 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.419382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.419397 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:38Z","lastTransitionTime":"2025-10-06T08:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.465472 4991 generic.go:334] "Generic (PLEG): container finished" podID="881045ce-f2cf-41d3-a315-eec70d0ed97d" containerID="fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9" exitCode=0 Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.465652 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" event={"ID":"881045ce-f2cf-41d3-a315-eec70d0ed97d","Type":"ContainerDied","Data":"fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9"} Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.481353 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.503887 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.522620 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.522674 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.522686 4991 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.522709 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.522721 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:38Z","lastTransitionTime":"2025-10-06T08:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.526935 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.545979 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.562678 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.574881 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.588902 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.606416 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.621063 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.626487 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.626562 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.626579 4991 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.626606 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.626621 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:38Z","lastTransitionTime":"2025-10-06T08:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.637419 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.654072 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.670780 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.683960 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.700811 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.723498 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:38Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.729561 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.729608 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.729619 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.729642 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.729660 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:38Z","lastTransitionTime":"2025-10-06T08:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.832617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.832666 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.832700 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.832719 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.832730 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:38Z","lastTransitionTime":"2025-10-06T08:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.935834 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.935907 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.935925 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.935954 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:38 crc kubenswrapper[4991]: I1006 08:19:38.935974 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:38Z","lastTransitionTime":"2025-10-06T08:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.039916 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.040010 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.040034 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.040069 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.040104 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:39Z","lastTransitionTime":"2025-10-06T08:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.143948 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.144003 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.144014 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.144036 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.144049 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:39Z","lastTransitionTime":"2025-10-06T08:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.243729 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.243745 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:39 crc kubenswrapper[4991]: E1006 08:19:39.244001 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.243902 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:39 crc kubenswrapper[4991]: E1006 08:19:39.244134 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:39 crc kubenswrapper[4991]: E1006 08:19:39.244387 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.246745 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.246827 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.246838 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.246856 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.246869 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:39Z","lastTransitionTime":"2025-10-06T08:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.269565 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.288369 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.303655 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.322061 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.341738 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.349339 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.349396 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.349411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.349436 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.349458 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:39Z","lastTransitionTime":"2025-10-06T08:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.359573 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.376444 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.392248 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.416248 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.432932 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.448852 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.452185 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.452234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.452270 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.452324 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.452345 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:39Z","lastTransitionTime":"2025-10-06T08:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.463484 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.477805 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.481172 4991 generic.go:334] "Generic (PLEG): container finished" podID="881045ce-f2cf-41d3-a315-eec70d0ed97d" containerID="34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22" exitCode=0 Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.481249 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" event={"ID":"881045ce-f2cf-41d3-a315-eec70d0ed97d","Type":"ContainerDied","Data":"34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 
08:19:39.482528 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 
08:19:39.497713 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.515114 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.528486 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.547005 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.555180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.555224 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.555242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:39 crc 
kubenswrapper[4991]: I1006 08:19:39.555265 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.555279 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:39Z","lastTransitionTime":"2025-10-06T08:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.564025 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.585067 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.598676 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.613064 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.626684 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.642309 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.658041 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.658116 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.658133 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.658162 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.658179 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:39Z","lastTransitionTime":"2025-10-06T08:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.662238 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe
002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.673998 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.688908 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.708124 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.730007 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.742377 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.758364 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 
08:19:39.762760 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.762805 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.762823 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.762844 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.762858 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:39Z","lastTransitionTime":"2025-10-06T08:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.867278 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.867344 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.867353 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.867373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.867385 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:39Z","lastTransitionTime":"2025-10-06T08:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.969964 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.970005 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.970015 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.970033 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:39 crc kubenswrapper[4991]: I1006 08:19:39.970043 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:39Z","lastTransitionTime":"2025-10-06T08:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.073284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.073361 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.073375 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.073401 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.073418 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:40Z","lastTransitionTime":"2025-10-06T08:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.175802 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.175840 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.175853 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.175872 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.175885 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:40Z","lastTransitionTime":"2025-10-06T08:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.279356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.279404 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.279421 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.279445 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.279463 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:40Z","lastTransitionTime":"2025-10-06T08:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.382444 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.382499 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.382512 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.382535 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.382550 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:40Z","lastTransitionTime":"2025-10-06T08:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.486583 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.486760 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.486788 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.486862 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.486897 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:40Z","lastTransitionTime":"2025-10-06T08:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.494114 4991 generic.go:334] "Generic (PLEG): container finished" podID="881045ce-f2cf-41d3-a315-eec70d0ed97d" containerID="ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81" exitCode=0 Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.494170 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" event={"ID":"881045ce-f2cf-41d3-a315-eec70d0ed97d","Type":"ContainerDied","Data":"ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81"} Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.508543 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.531233 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.554462 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.569720 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.584318 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.590067 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.590138 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.590148 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.590168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.590182 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:40Z","lastTransitionTime":"2025-10-06T08:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.595370 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.610353 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.625894 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.646101 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.660190 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.678033 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.691647 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.692371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.692539 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.692620 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.692715 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.692776 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:40Z","lastTransitionTime":"2025-10-06T08:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.705867 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.718538 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.732146 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:40Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.795791 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.795851 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.795863 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.795887 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.795900 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:40Z","lastTransitionTime":"2025-10-06T08:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.899209 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.899662 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.899751 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.899842 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:40 crc kubenswrapper[4991]: I1006 08:19:40.899906 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:40Z","lastTransitionTime":"2025-10-06T08:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.002513 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.002554 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.002565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.002581 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.002590 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:41Z","lastTransitionTime":"2025-10-06T08:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.105780 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.105847 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.105862 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.105884 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.105898 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:41Z","lastTransitionTime":"2025-10-06T08:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.208246 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.208315 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.208331 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.208356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.208369 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:41Z","lastTransitionTime":"2025-10-06T08:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.243554 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.243554 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:41 crc kubenswrapper[4991]: E1006 08:19:41.243706 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.243753 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:41 crc kubenswrapper[4991]: E1006 08:19:41.243914 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:41 crc kubenswrapper[4991]: E1006 08:19:41.243771 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.310878 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.311265 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.311502 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.311753 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.311939 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:41Z","lastTransitionTime":"2025-10-06T08:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.415771 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.416223 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.416238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.416265 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.416278 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:41Z","lastTransitionTime":"2025-10-06T08:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.502314 4991 generic.go:334] "Generic (PLEG): container finished" podID="881045ce-f2cf-41d3-a315-eec70d0ed97d" containerID="ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705" exitCode=0 Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.502410 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" event={"ID":"881045ce-f2cf-41d3-a315-eec70d0ed97d","Type":"ContainerDied","Data":"ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.511680 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.512056 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.512121 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.518919 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.518961 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.518970 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.518988 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:41 
crc kubenswrapper[4991]: I1006 08:19:41.519000 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:41Z","lastTransitionTime":"2025-10-06T08:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.522406 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.538314 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.539709 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.540173 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.549904 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.561086 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.575998 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.590833 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.602954 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.617711 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.627553 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.627616 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.627633 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.627655 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.627672 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:41Z","lastTransitionTime":"2025-10-06T08:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.641157 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.653259 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.669093 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.691054 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.704491 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.720931 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.730001 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.730031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.730040 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:41 crc 
kubenswrapper[4991]: I1006 08:19:41.730056 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.730066 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:41Z","lastTransitionTime":"2025-10-06T08:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.734003 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.748575 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.766852 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.779722 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.794024 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.823731 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.833024 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.833056 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.833066 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.833081 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.833090 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:41Z","lastTransitionTime":"2025-10-06T08:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.837749 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.853409 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.874550 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{
\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.892484 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.909547 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.921471 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.935450 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.935517 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.935538 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.935564 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.935583 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:41Z","lastTransitionTime":"2025-10-06T08:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.939736 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.954585 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.971607 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:41 crc kubenswrapper[4991]: I1006 08:19:41.988804 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:41Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.038833 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.038889 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.038901 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.038926 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.038945 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:42Z","lastTransitionTime":"2025-10-06T08:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.142479 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.142530 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.142540 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.142561 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.142571 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:42Z","lastTransitionTime":"2025-10-06T08:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.245804 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.245879 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.245908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.245943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.245977 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:42Z","lastTransitionTime":"2025-10-06T08:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.349371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.349420 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.349439 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.349461 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.349474 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:42Z","lastTransitionTime":"2025-10-06T08:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.452427 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.452535 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.452555 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.452588 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.452608 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:42Z","lastTransitionTime":"2025-10-06T08:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.520740 4991 generic.go:334] "Generic (PLEG): container finished" podID="881045ce-f2cf-41d3-a315-eec70d0ed97d" containerID="5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41" exitCode=0 Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.520865 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" event={"ID":"881045ce-f2cf-41d3-a315-eec70d0ed97d","Type":"ContainerDied","Data":"5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41"} Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.520959 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.545041 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.560998 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.561070 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.561091 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:42 crc 
kubenswrapper[4991]: I1006 08:19:42.561120 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.561148 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:42Z","lastTransitionTime":"2025-10-06T08:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.564767 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.597818 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.611504 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.625637 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.640387 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.656417 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.664476 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.664522 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.664533 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.664554 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.664567 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:42Z","lastTransitionTime":"2025-10-06T08:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.670923 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.681637 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.694140 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.723913 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.746541 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.769954 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.769954 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.770007 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.770022 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.770042 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.770059 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:42Z","lastTransitionTime":"2025-10-06T08:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.784952 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.803119 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:42Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.873267 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.873332 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.873344 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.873369 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.873383 4991 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:42Z","lastTransitionTime":"2025-10-06T08:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.976768 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.976815 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.976825 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.976844 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:42 crc kubenswrapper[4991]: I1006 08:19:42.976856 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:42Z","lastTransitionTime":"2025-10-06T08:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.080522 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.080589 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.080606 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.080632 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.080649 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:43Z","lastTransitionTime":"2025-10-06T08:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.184768 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.184850 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.184868 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.184896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.184919 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:43Z","lastTransitionTime":"2025-10-06T08:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.243832 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.243924 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:43 crc kubenswrapper[4991]: E1006 08:19:43.244044 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.244124 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:43 crc kubenswrapper[4991]: E1006 08:19:43.244432 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:43 crc kubenswrapper[4991]: E1006 08:19:43.244552 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.288148 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.288217 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.288249 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.288284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.288355 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:43Z","lastTransitionTime":"2025-10-06T08:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.391331 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.391382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.391400 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.391417 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.391427 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:43Z","lastTransitionTime":"2025-10-06T08:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.494778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.494899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.494908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.494930 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.494943 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:43Z","lastTransitionTime":"2025-10-06T08:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.531919 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" event={"ID":"881045ce-f2cf-41d3-a315-eec70d0ed97d","Type":"ContainerStarted","Data":"c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202"} Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.532009 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.550970 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.565664 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.595257 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08
:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.597719 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.597808 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.597833 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.597871 4991 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.597897 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:43Z","lastTransitionTime":"2025-10-06T08:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.618545 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.632531 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.654735 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.672892 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.692028 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.705778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.705836 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.705857 4991 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.705888 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.705911 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:43Z","lastTransitionTime":"2025-10-06T08:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.710155 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8e
b1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.729890 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.754776 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.777659 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.798490 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.808338 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.808429 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.808446 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.808472 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.808494 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:43Z","lastTransitionTime":"2025-10-06T08:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.815426 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.836258 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:43Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.912368 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.912462 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.912486 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.912516 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:43 crc kubenswrapper[4991]: I1006 08:19:43.912540 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:43Z","lastTransitionTime":"2025-10-06T08:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.015879 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.015933 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.015944 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.015964 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.015978 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:44Z","lastTransitionTime":"2025-10-06T08:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.119074 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.119138 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.119150 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.119169 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.119182 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:44Z","lastTransitionTime":"2025-10-06T08:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.222485 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.222543 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.222557 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.222579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.222593 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:44Z","lastTransitionTime":"2025-10-06T08:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.243625 4991 scope.go:117] "RemoveContainer" containerID="9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.325981 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.326022 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.326033 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.326055 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.326070 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:44Z","lastTransitionTime":"2025-10-06T08:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.429871 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.429930 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.429942 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.429964 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.429980 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:44Z","lastTransitionTime":"2025-10-06T08:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.532963 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.533454 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.533465 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.533482 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.533494 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:44Z","lastTransitionTime":"2025-10-06T08:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.537655 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.539770 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.540720 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.559685 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.574755 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.588425 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.607321 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.625052 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.636634 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.636692 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.636709 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.636736 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.636757 4991 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:44Z","lastTransitionTime":"2025-10-06T08:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.641570 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.656586 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.672829 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.707518 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.720520 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.735605 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/o
pt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616
e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.739617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.739662 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.739676 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.739699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.739712 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:44Z","lastTransitionTime":"2025-10-06T08:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.764854 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.781870 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.800512 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.812358 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:44Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.842486 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.842540 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.842549 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.842568 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.842580 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:44Z","lastTransitionTime":"2025-10-06T08:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.945743 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.945806 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.945820 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.945841 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.945858 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:44Z","lastTransitionTime":"2025-10-06T08:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.980508 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.980747 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.980771 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:20:00.980736375 +0000 UTC m=+52.718486396 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.980834 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.980881 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.980916 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:44 crc kubenswrapper[4991]: I1006 08:19:44.980952 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981027 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:20:00.981013813 +0000 UTC m=+52.718763834 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981091 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981148 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981114 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981165 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981178 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981214 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:20:00.981207519 +0000 UTC m=+52.718957540 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981151 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981254 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981254 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:20:00.981226349 +0000 UTC m=+52.718976380 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:19:44 crc kubenswrapper[4991]: E1006 08:19:44.981320 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:20:00.981289931 +0000 UTC m=+52.719039952 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.048617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.048672 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.048686 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.048708 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.048722 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.151387 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.151451 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.151469 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.151494 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.151511 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.219056 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.219135 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.219159 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.219189 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.219214 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: E1006 08:19:45.238189 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.242875 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.242876 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.242875 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:45 crc kubenswrapper[4991]: E1006 08:19:45.243259 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:45 crc kubenswrapper[4991]: E1006 08:19:45.243480 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:45 crc kubenswrapper[4991]: E1006 08:19:45.243665 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.245659 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.245731 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.245778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.245813 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.245838 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: E1006 08:19:45.268422 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.273754 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.273819 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.273844 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.273876 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.273901 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: E1006 08:19:45.296130 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.302049 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.302109 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.302169 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.302205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.302230 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: E1006 08:19:45.325071 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.330733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.330782 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.330795 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.330815 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.330830 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: E1006 08:19:45.348775 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: E1006 08:19:45.348959 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.351087 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.351141 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.351159 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.351182 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc 
kubenswrapper[4991]: I1006 08:19:45.351199 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.454079 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.454149 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.454165 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.454191 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.454210 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.545820 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/0.log" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.549083 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46" exitCode=1 Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.549176 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46"} Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.550854 4991 scope.go:117] "RemoveContainer" containerID="04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.557518 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.557583 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.557602 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.557650 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.557669 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.576529 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a11
54afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810
0674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.589697 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.613241 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.679880 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.679972 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.680031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc 
kubenswrapper[4991]: I1006 08:19:45.680042 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.680085 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.680100 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.694769 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.704861 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.719101 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.734674 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.751161 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.765583 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.781523 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 
08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.783353 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.783383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.783392 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.783405 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.783414 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.795483 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.812834 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.833642 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:44Z\\\",\\\"message\\\":\\\"I1006 08:19:44.604487 6453 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:44.604525 6453 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 08:19:44.604529 6453 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI1006 08:19:44.604535 6453 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:44.604544 6453 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:44.604554 6453 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:44.604562 6453 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:44.604859 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:19:44.604891 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:19:44.604897 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:44.604909 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:44.604929 6453 factory.go:656] Stopping watch factory\\\\nI1006 08:19:44.604935 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:19:44.604938 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:44.604945 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:44.604953 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.848107 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:45Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.886181 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.886224 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.886242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.886267 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.886284 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.989519 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.989593 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.989611 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.989638 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:45 crc kubenswrapper[4991]: I1006 08:19:45.989655 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:45Z","lastTransitionTime":"2025-10-06T08:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.093373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.093458 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.093475 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.093499 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.093511 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:46Z","lastTransitionTime":"2025-10-06T08:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.196855 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.196889 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.196899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.196920 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.196933 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:46Z","lastTransitionTime":"2025-10-06T08:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.299570 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.299619 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.299629 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.299649 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.299663 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:46Z","lastTransitionTime":"2025-10-06T08:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.403028 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.403086 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.403099 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.403124 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.403139 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:46Z","lastTransitionTime":"2025-10-06T08:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.506515 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.506575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.506586 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.506609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.506622 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:46Z","lastTransitionTime":"2025-10-06T08:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.556803 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/0.log" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.560453 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86"} Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.560615 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.581067 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.596071 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.610165 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.610220 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.610229 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.610250 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.610263 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:46Z","lastTransitionTime":"2025-10-06T08:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.612627 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.637235 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.659745 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:44Z\\\",\\\"message\\\":\\\"I1006 08:19:44.604487 6453 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:44.604525 6453 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 08:19:44.604529 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:19:44.604535 6453 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1006 08:19:44.604544 6453 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:44.604554 6453 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:44.604562 6453 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:44.604859 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:19:44.604891 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:19:44.604897 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:44.604909 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:44.604929 6453 factory.go:656] Stopping watch factory\\\\nI1006 08:19:44.604935 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:19:44.604938 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:44.604945 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:44.604953 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.680932 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.697561 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.709408 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.712620 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.712646 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.712654 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.712669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.712681 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:46Z","lastTransitionTime":"2025-10-06T08:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.727745 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z 
is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.743047 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.761400 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca698
16d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.784140 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be
39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.797781 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.811545 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.815498 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.815542 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.815554 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.815571 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.815582 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:46Z","lastTransitionTime":"2025-10-06T08:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.824167 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.918997 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.919032 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.919042 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.919058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:46 crc kubenswrapper[4991]: I1006 08:19:46.919068 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:46Z","lastTransitionTime":"2025-10-06T08:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.022290 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.022376 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.022394 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.022419 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.022436 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:47Z","lastTransitionTime":"2025-10-06T08:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.125660 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.125746 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.125768 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.125796 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.125814 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:47Z","lastTransitionTime":"2025-10-06T08:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.229126 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.229170 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.229181 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.229201 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.229214 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:47Z","lastTransitionTime":"2025-10-06T08:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.243607 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.243668 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:47 crc kubenswrapper[4991]: E1006 08:19:47.243795 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.243813 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:47 crc kubenswrapper[4991]: E1006 08:19:47.243978 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:47 crc kubenswrapper[4991]: E1006 08:19:47.244083 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.331584 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.331633 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.331645 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.331666 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.331681 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:47Z","lastTransitionTime":"2025-10-06T08:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.384682 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85"] Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.385380 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.388506 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.389135 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.401861 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66
438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.423553 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e5431
9f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.434565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.434643 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.434665 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.434702 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.434795 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:47Z","lastTransitionTime":"2025-10-06T08:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.442952 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.460563 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.475715 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.499456 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08
:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.508837 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/775167a6-c1d2-4436-867f-3cf3e9dedd3e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.508880 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/775167a6-c1d2-4436-867f-3cf3e9dedd3e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.508997 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/775167a6-c1d2-4436-867f-3cf3e9dedd3e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.509054 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lwjx\" (UniqueName: \"kubernetes.io/projected/775167a6-c1d2-4436-867f-3cf3e9dedd3e-kube-api-access-7lwjx\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.514101 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.533413 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.541020 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.541085 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.541107 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.541136 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.541149 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:47Z","lastTransitionTime":"2025-10-06T08:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.552579 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.566278 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/1.log" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.566928 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/0.log" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.570008 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86" exitCode=1 Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.570077 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.570143 4991 scope.go:117] "RemoveContainer" containerID="04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.571234 4991 
scope.go:117] "RemoveContainer" containerID="a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86" Oct 06 08:19:47 crc kubenswrapper[4991]: E1006 08:19:47.571463 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.571586 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.590455 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.603562 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.610288 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/775167a6-c1d2-4436-867f-3cf3e9dedd3e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.610405 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lwjx\" (UniqueName: \"kubernetes.io/projected/775167a6-c1d2-4436-867f-3cf3e9dedd3e-kube-api-access-7lwjx\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.610462 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/775167a6-c1d2-4436-867f-3cf3e9dedd3e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.610516 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/775167a6-c1d2-4436-867f-3cf3e9dedd3e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.611639 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/775167a6-c1d2-4436-867f-3cf3e9dedd3e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.612203 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/775167a6-c1d2-4436-867f-3cf3e9dedd3e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.616534 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.619415 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/775167a6-c1d2-4436-867f-3cf3e9dedd3e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.633040 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lwjx\" (UniqueName: \"kubernetes.io/projected/775167a6-c1d2-4436-867f-3cf3e9dedd3e-kube-api-access-7lwjx\") pod \"ovnkube-control-plane-749d76644c-t6c85\" (UID: \"775167a6-c1d2-4436-867f-3cf3e9dedd3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.644526 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.644573 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.644582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.644602 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.644613 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:47Z","lastTransitionTime":"2025-10-06T08:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.650890 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:44Z\\\",\\\"message\\\":\\\"I1006 08:19:44.604487 6453 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:44.604525 6453 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 08:19:44.604529 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:19:44.604535 6453 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1006 08:19:44.604544 6453 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:44.604554 6453 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:44.604562 6453 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:44.604859 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:19:44.604891 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:19:44.604897 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:44.604909 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:44.604929 6453 factory.go:656] Stopping watch factory\\\\nI1006 08:19:44.604935 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:19:44.604938 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:44.604945 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:44.604953 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.665724 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.680536 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.697413 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.704711 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.711088 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.727227 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.751202 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.751255 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.751268 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.751306 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.751320 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:47Z","lastTransitionTime":"2025-10-06T08:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.751781 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.766510 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.783577 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.812641 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.851137 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.857113 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.857495 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.857650 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.857772 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.857897 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:47Z","lastTransitionTime":"2025-10-06T08:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.878973 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.898008 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.916170 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.939399 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:44Z\\\",\\\"message\\\":\\\"I1006 08:19:44.604487 6453 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:44.604525 6453 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 08:19:44.604529 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:19:44.604535 6453 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1006 08:19:44.604544 6453 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:44.604554 6453 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:44.604562 6453 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:44.604859 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:19:44.604891 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:19:44.604897 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:44.604909 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:44.604929 6453 factory.go:656] Stopping watch factory\\\\nI1006 08:19:44.604935 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:19:44.604938 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:44.604945 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:44.604953 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:46Z\\\",\\\"message\\\":\\\"rnal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:19:46.756823 6657 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe48a0 0x1fe4580 0x1fe4520} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:19:46.756831 6657 services_controller.go:443] Built service openshift-kube-scheduler/scheduler LB cluster-wide conf\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mou
ntPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.955676 4991 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.960408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.960444 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.960455 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.960479 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.960491 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:47Z","lastTransitionTime":"2025-10-06T08:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.971965 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0
573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:47 crc kubenswrapper[4991]: I1006 08:19:47.983190 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.001200 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca698
16d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.068034 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.068117 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.068135 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.068164 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.068186 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:48Z","lastTransitionTime":"2025-10-06T08:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.170857 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.171461 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.171476 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.171518 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.171531 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:48Z","lastTransitionTime":"2025-10-06T08:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.273670 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.273711 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.273722 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.273739 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.273750 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:48Z","lastTransitionTime":"2025-10-06T08:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.377049 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.377107 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.377120 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.377147 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.377162 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:48Z","lastTransitionTime":"2025-10-06T08:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.479824 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.479878 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.479894 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.479916 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.479928 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:48Z","lastTransitionTime":"2025-10-06T08:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.506977 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-787zw"] Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.507758 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:48 crc kubenswrapper[4991]: E1006 08:19:48.507864 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.521112 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.540783 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca698
16d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.554040 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.568578 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.576330 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/1.log" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.581968 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.582015 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.582034 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.582060 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.582081 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:48Z","lastTransitionTime":"2025-10-06T08:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.582368 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" event={"ID":"775167a6-c1d2-4436-867f-3cf3e9dedd3e","Type":"ContainerStarted","Data":"3e892a4ab7c2c27cdd7cd3610ab26bc56b0af54ab2652104f5918693f12bc12c"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.582430 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" event={"ID":"775167a6-c1d2-4436-867f-3cf3e9dedd3e","Type":"ContainerStarted","Data":"827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.582447 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" event={"ID":"775167a6-c1d2-4436-867f-3cf3e9dedd3e","Type":"ContainerStarted","Data":"5f124e37192cfd771280bef4ebdce105430fcfecef78ca30db4fd18b233db378"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.603047 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.620007 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.621969 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.622018 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dggwl\" (UniqueName: \"kubernetes.io/projected/3e38e446-d0d7-463a-987a-110a8e95fe84-kube-api-access-dggwl\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.635126 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.652603 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.668993 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.682423 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.684649 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.684697 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.684710 4991 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.684732 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.684745 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:48Z","lastTransitionTime":"2025-10-06T08:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.694980 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.708604 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.723567 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.723639 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dggwl\" (UniqueName: \"kubernetes.io/projected/3e38e446-d0d7-463a-987a-110a8e95fe84-kube-api-access-dggwl\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:48 crc kubenswrapper[4991]: E1006 08:19:48.723867 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:19:48 crc kubenswrapper[4991]: E1006 08:19:48.723984 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs podName:3e38e446-d0d7-463a-987a-110a8e95fe84 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:49.223952683 +0000 UTC m=+40.961702704 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs") pod "network-metrics-daemon-787zw" (UID: "3e38e446-d0d7-463a-987a-110a8e95fe84") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.729169 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:44Z\\\",\\\"message\\\":\\\"I1006 08:19:44.604487 6453 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:44.604525 6453 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 08:19:44.604529 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:19:44.604535 6453 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1006 08:19:44.604544 6453 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:44.604554 6453 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:44.604562 6453 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:44.604859 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:19:44.604891 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:19:44.604897 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:44.604909 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:44.604929 6453 factory.go:656] Stopping watch factory\\\\nI1006 08:19:44.604935 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:19:44.604938 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:44.604945 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:44.604953 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:46Z\\\",\\\"message\\\":\\\"rnal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:19:46.756823 6657 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe48a0 0x1fe4580 0x1fe4520} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:19:46.756831 6657 services_controller.go:443] Built service openshift-kube-scheduler/scheduler LB cluster-wide conf\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mou
ntPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.741784 4991 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.744595 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dggwl\" (UniqueName: \"kubernetes.io/projected/3e38e446-d0d7-463a-987a-110a8e95fe84-kube-api-access-dggwl\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.756654 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.772363 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.785519 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.788209 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.788246 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.788261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.788282 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.788314 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:48Z","lastTransitionTime":"2025-10-06T08:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.817487 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.831888 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.845390 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.861015 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.875379 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc 
kubenswrapper[4991]: I1006 08:19:48.891088 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.891145 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.891154 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.891169 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.891179 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:48Z","lastTransitionTime":"2025-10-06T08:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.898919 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.920464 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.940906 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.959218 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.977370 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.994367 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.994410 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.994425 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.994447 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.994462 4991 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:48Z","lastTransitionTime":"2025-10-06T08:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:48 crc kubenswrapper[4991]: I1006 08:19:48.995214 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:48Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.006597 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.022454 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.044893 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:44Z\\\",\\\"message\\\":\\\"I1006 08:19:44.604487 6453 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:44.604525 6453 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 08:19:44.604529 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:19:44.604535 6453 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1006 08:19:44.604544 6453 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:44.604554 6453 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:44.604562 6453 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:44.604859 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:19:44.604891 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:19:44.604897 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:44.604909 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:44.604929 6453 factory.go:656] Stopping watch factory\\\\nI1006 08:19:44.604935 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:19:44.604938 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:44.604945 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:44.604953 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:46Z\\\",\\\"message\\\":\\\"rnal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:19:46.756823 6657 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe48a0 0x1fe4580 0x1fe4520} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:19:46.756831 6657 services_controller.go:443] Built service openshift-kube-scheduler/scheduler LB cluster-wide conf\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mou
ntPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.061561 4991 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.076954 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.097581 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.097624 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.097633 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:49 crc 
kubenswrapper[4991]: I1006 08:19:49.097649 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.097657 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:49Z","lastTransitionTime":"2025-10-06T08:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.100669 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52
47f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.200205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.200245 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.200254 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.200271 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.200283 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:49Z","lastTransitionTime":"2025-10-06T08:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.229003 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:49 crc kubenswrapper[4991]: E1006 08:19:49.229174 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:19:49 crc kubenswrapper[4991]: E1006 08:19:49.229233 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs podName:3e38e446-d0d7-463a-987a-110a8e95fe84 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:50.229216255 +0000 UTC m=+41.966966276 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs") pod "network-metrics-daemon-787zw" (UID: "3e38e446-d0d7-463a-987a-110a8e95fe84") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.242920 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.242920 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.242998 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:49 crc kubenswrapper[4991]: E1006 08:19:49.243160 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:49 crc kubenswrapper[4991]: E1006 08:19:49.243250 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:49 crc kubenswrapper[4991]: E1006 08:19:49.243324 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.261917 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-0
6T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.293344 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.304184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.304231 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.304242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 
08:19:49.304258 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.304268 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:49Z","lastTransitionTime":"2025-10-06T08:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.310324 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.326347 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.339875 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc 
kubenswrapper[4991]: I1006 08:19:49.355187 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.370912 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.385167 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.405475 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.407424 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.407481 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.407492 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.407514 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.407525 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:49Z","lastTransitionTime":"2025-10-06T08:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.419590 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.439592 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.461364 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.475080 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.488350 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.510097 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.510150 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.510165 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.510186 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.510197 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:49Z","lastTransitionTime":"2025-10-06T08:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.510396 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04e6234e1804890e9ad2c99f9cd69cac181b188975da4d1a2e7e61a5a5dfcd46\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:44Z\\\",\\\"message\\\":\\\"I1006 08:19:44.604487 6453 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:44.604525 6453 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 08:19:44.604529 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 08:19:44.604535 6453 handler.go:190] Sending *v1.Pod event handler 6 for 
removal\\\\nI1006 08:19:44.604544 6453 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:44.604554 6453 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:44.604562 6453 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:44.604859 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 08:19:44.604891 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:19:44.604897 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:44.604909 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:44.604929 6453 factory.go:656] Stopping watch factory\\\\nI1006 08:19:44.604935 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 08:19:44.604938 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:44.604945 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:44.604953 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:46Z\\\",\\\"message\\\":\\\"rnal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:19:46.756823 6657 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe48a0 0x1fe4580 0x1fe4520} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:19:46.756831 6657 services_controller.go:443] Built service openshift-kube-scheduler/scheduler LB cluster-wide conf\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mou
ntPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.524134 4991 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.550046 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca698
16d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.612875 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.612918 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.612930 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.612946 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.612957 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:49Z","lastTransitionTime":"2025-10-06T08:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.716548 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.716611 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.716627 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.716645 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.716657 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:49Z","lastTransitionTime":"2025-10-06T08:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.819248 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.819365 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.819386 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.819423 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.819442 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:49Z","lastTransitionTime":"2025-10-06T08:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.922803 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.922880 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.922896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.922917 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:49 crc kubenswrapper[4991]: I1006 08:19:49.922933 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:49Z","lastTransitionTime":"2025-10-06T08:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.026105 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.026158 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.026172 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.026190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.026203 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:50Z","lastTransitionTime":"2025-10-06T08:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.130342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.130417 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.130445 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.130477 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.130501 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:50Z","lastTransitionTime":"2025-10-06T08:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.233220 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.233342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.233362 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.233391 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.233409 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:50Z","lastTransitionTime":"2025-10-06T08:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.239969 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:50 crc kubenswrapper[4991]: E1006 08:19:50.240261 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:19:50 crc kubenswrapper[4991]: E1006 08:19:50.240442 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs podName:3e38e446-d0d7-463a-987a-110a8e95fe84 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:52.240380015 +0000 UTC m=+43.978130066 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs") pod "network-metrics-daemon-787zw" (UID: "3e38e446-d0d7-463a-987a-110a8e95fe84") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.243741 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:50 crc kubenswrapper[4991]: E1006 08:19:50.243997 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.337926 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.337994 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.338010 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.338032 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.338050 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:50Z","lastTransitionTime":"2025-10-06T08:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.441052 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.441101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.441114 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.441133 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.441153 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:50Z","lastTransitionTime":"2025-10-06T08:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.544590 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.544677 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.544696 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.544725 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.544744 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:50Z","lastTransitionTime":"2025-10-06T08:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.648264 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.648362 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.648382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.648406 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.648424 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:50Z","lastTransitionTime":"2025-10-06T08:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.751917 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.751987 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.752011 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.752042 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.752064 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:50Z","lastTransitionTime":"2025-10-06T08:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.854485 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.854544 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.854555 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.854575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.854593 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:50Z","lastTransitionTime":"2025-10-06T08:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.957836 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.957896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.957916 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.957940 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:50 crc kubenswrapper[4991]: I1006 08:19:50.957958 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:50Z","lastTransitionTime":"2025-10-06T08:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.061929 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.062002 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.062020 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.062051 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.062069 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:51Z","lastTransitionTime":"2025-10-06T08:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.165267 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.165382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.165408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.165439 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.165460 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:51Z","lastTransitionTime":"2025-10-06T08:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.243479 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.243537 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.243495 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:51 crc kubenswrapper[4991]: E1006 08:19:51.243725 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:51 crc kubenswrapper[4991]: E1006 08:19:51.243931 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:51 crc kubenswrapper[4991]: E1006 08:19:51.244193 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.268618 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.268684 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.268701 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.268727 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.268750 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:51Z","lastTransitionTime":"2025-10-06T08:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.372551 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.372615 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.372631 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.372656 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.372673 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:51Z","lastTransitionTime":"2025-10-06T08:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.475665 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.475725 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.475741 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.475766 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.475787 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:51Z","lastTransitionTime":"2025-10-06T08:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.578756 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.578811 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.578822 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.578842 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.578855 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:51Z","lastTransitionTime":"2025-10-06T08:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.682093 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.682172 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.682199 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.682230 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.682252 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:51Z","lastTransitionTime":"2025-10-06T08:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.785637 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.785701 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.785711 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.785731 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.785742 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:51Z","lastTransitionTime":"2025-10-06T08:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.888730 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.888789 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.888805 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.888829 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.888850 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:51Z","lastTransitionTime":"2025-10-06T08:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.992432 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.992787 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.992939 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.993121 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:51 crc kubenswrapper[4991]: I1006 08:19:51.993988 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:51Z","lastTransitionTime":"2025-10-06T08:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.097975 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.098382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.098601 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.098741 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.098914 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:52Z","lastTransitionTime":"2025-10-06T08:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.202073 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.202149 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.202166 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.202190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.202211 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:52Z","lastTransitionTime":"2025-10-06T08:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.243525 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:52 crc kubenswrapper[4991]: E1006 08:19:52.243726 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.265791 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:52 crc kubenswrapper[4991]: E1006 08:19:52.266009 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:19:52 crc kubenswrapper[4991]: E1006 08:19:52.266138 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs podName:3e38e446-d0d7-463a-987a-110a8e95fe84 nodeName:}" failed. No retries permitted until 2025-10-06 08:19:56.266108362 +0000 UTC m=+48.003858423 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs") pod "network-metrics-daemon-787zw" (UID: "3e38e446-d0d7-463a-987a-110a8e95fe84") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.305140 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.305224 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.305242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.305266 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.305285 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:52Z","lastTransitionTime":"2025-10-06T08:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.408659 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.408745 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.408767 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.408798 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.408823 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:52Z","lastTransitionTime":"2025-10-06T08:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.512083 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.512217 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.512284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.512383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.512407 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:52Z","lastTransitionTime":"2025-10-06T08:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.615759 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.615851 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.615883 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.615911 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.615931 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:52Z","lastTransitionTime":"2025-10-06T08:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.718820 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.718864 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.718874 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.718890 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.718902 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:52Z","lastTransitionTime":"2025-10-06T08:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.822017 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.822106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.822141 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.822171 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.822208 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:52Z","lastTransitionTime":"2025-10-06T08:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.925447 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.925502 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.925514 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.925528 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:52 crc kubenswrapper[4991]: I1006 08:19:52.925537 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:52Z","lastTransitionTime":"2025-10-06T08:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.028982 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.029058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.029083 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.029119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.029144 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:53Z","lastTransitionTime":"2025-10-06T08:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.131632 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.131695 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.131712 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.131736 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.131754 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:53Z","lastTransitionTime":"2025-10-06T08:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.234880 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.234948 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.234965 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.234996 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.235016 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:53Z","lastTransitionTime":"2025-10-06T08:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.243676 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.244370 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:53 crc kubenswrapper[4991]: E1006 08:19:53.244625 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.245109 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:53 crc kubenswrapper[4991]: E1006 08:19:53.245269 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:53 crc kubenswrapper[4991]: E1006 08:19:53.245551 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.338982 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.339043 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.339063 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.339091 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.339112 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:53Z","lastTransitionTime":"2025-10-06T08:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.442089 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.442157 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.442179 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.442209 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.442228 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:53Z","lastTransitionTime":"2025-10-06T08:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.545514 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.545584 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.545601 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.545626 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.545646 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:53Z","lastTransitionTime":"2025-10-06T08:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.648920 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.648989 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.649008 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.649039 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.649056 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:53Z","lastTransitionTime":"2025-10-06T08:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.752121 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.752171 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.752183 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.752202 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.752215 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:53Z","lastTransitionTime":"2025-10-06T08:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.855901 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.855966 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.855986 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.856010 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.856028 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:53Z","lastTransitionTime":"2025-10-06T08:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.959208 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.959284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.959333 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.959356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:53 crc kubenswrapper[4991]: I1006 08:19:53.959373 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:53Z","lastTransitionTime":"2025-10-06T08:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.062803 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.062869 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.062888 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.062918 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.062936 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:54Z","lastTransitionTime":"2025-10-06T08:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.166505 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.166562 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.166574 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.166597 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.166648 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:54Z","lastTransitionTime":"2025-10-06T08:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.242998 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:54 crc kubenswrapper[4991]: E1006 08:19:54.243226 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.269803 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.269865 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.269882 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.269909 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.269928 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:54Z","lastTransitionTime":"2025-10-06T08:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.372963 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.373071 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.373090 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.373121 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.373142 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:54Z","lastTransitionTime":"2025-10-06T08:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.475755 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.475812 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.475825 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.475844 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.475856 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:54Z","lastTransitionTime":"2025-10-06T08:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.579227 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.579375 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.579395 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.579464 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.579485 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:54Z","lastTransitionTime":"2025-10-06T08:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.682089 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.682170 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.682225 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.682261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.682288 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:54Z","lastTransitionTime":"2025-10-06T08:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.785100 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.785158 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.785170 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.785188 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.785202 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:54Z","lastTransitionTime":"2025-10-06T08:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.888220 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.888285 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.888315 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.888334 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.888346 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:54Z","lastTransitionTime":"2025-10-06T08:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.991238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.991340 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.991359 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.991383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:54 crc kubenswrapper[4991]: I1006 08:19:54.991404 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:54Z","lastTransitionTime":"2025-10-06T08:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.094363 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.094439 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.094467 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.094497 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.094522 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.197476 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.197549 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.197567 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.197603 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.197622 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.243201 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.243214 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:55 crc kubenswrapper[4991]: E1006 08:19:55.243458 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:55 crc kubenswrapper[4991]: E1006 08:19:55.243642 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.243189 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:55 crc kubenswrapper[4991]: E1006 08:19:55.243778 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.301831 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.301889 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.301905 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.301928 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.301947 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.409188 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.409225 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.409263 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.409277 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.409287 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.512050 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.513229 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.513400 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.513545 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.513690 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.573538 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.573600 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.573614 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.573636 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.573652 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: E1006 08:19:55.594000 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.600776 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.600860 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.600879 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.600904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.600925 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: E1006 08:19:55.630259 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.636602 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.636644 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.636655 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.636671 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.636683 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: E1006 08:19:55.653915 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.658830 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.658860 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.658868 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.658881 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.658891 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: E1006 08:19:55.671560 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.675915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.676020 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.676061 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.676092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.676111 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: E1006 08:19:55.689592 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:55Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:55 crc kubenswrapper[4991]: E1006 08:19:55.689710 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.691988 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.692024 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.692034 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.692052 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.692064 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.795412 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.795484 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.795505 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.795529 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.795548 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.898806 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.898844 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.898855 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.898873 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:55 crc kubenswrapper[4991]: I1006 08:19:55.898887 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:55Z","lastTransitionTime":"2025-10-06T08:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.002461 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.002547 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.002571 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.002602 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.002626 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:56Z","lastTransitionTime":"2025-10-06T08:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.106353 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.106932 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.107238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.107520 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.107714 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:56Z","lastTransitionTime":"2025-10-06T08:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.210816 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.210877 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.210893 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.210922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.210942 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:56Z","lastTransitionTime":"2025-10-06T08:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.243959 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:56 crc kubenswrapper[4991]: E1006 08:19:56.244227 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.312685 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:56 crc kubenswrapper[4991]: E1006 08:19:56.312973 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.315212 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.315678 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.315949 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.316150 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.316433 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:56Z","lastTransitionTime":"2025-10-06T08:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:56 crc kubenswrapper[4991]: E1006 08:19:56.316978 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs podName:3e38e446-d0d7-463a-987a-110a8e95fe84 nodeName:}" failed. No retries permitted until 2025-10-06 08:20:04.315482137 +0000 UTC m=+56.053232198 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs") pod "network-metrics-daemon-787zw" (UID: "3e38e446-d0d7-463a-987a-110a8e95fe84") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.419951 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.420016 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.420040 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.420071 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.420094 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:56Z","lastTransitionTime":"2025-10-06T08:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.523985 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.524073 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.524098 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.524131 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.524154 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:56Z","lastTransitionTime":"2025-10-06T08:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.633027 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.633079 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.633096 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.633121 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.633136 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:56Z","lastTransitionTime":"2025-10-06T08:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.737623 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.737690 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.737705 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.737733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.737748 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:56Z","lastTransitionTime":"2025-10-06T08:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.840897 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.840935 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.840943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.840957 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.840967 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:56Z","lastTransitionTime":"2025-10-06T08:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.944346 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.944400 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.944416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.944434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:56 crc kubenswrapper[4991]: I1006 08:19:56.944445 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:56Z","lastTransitionTime":"2025-10-06T08:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.047831 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.047883 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.047899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.047919 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.047932 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:57Z","lastTransitionTime":"2025-10-06T08:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.151495 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.151570 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.151589 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.151614 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.151630 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:57Z","lastTransitionTime":"2025-10-06T08:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.243887 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.243911 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:57 crc kubenswrapper[4991]: E1006 08:19:57.244102 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.244280 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:57 crc kubenswrapper[4991]: E1006 08:19:57.244493 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:57 crc kubenswrapper[4991]: E1006 08:19:57.244663 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.254327 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.254370 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.254382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.254397 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.254409 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:57Z","lastTransitionTime":"2025-10-06T08:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.358099 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.358174 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.358199 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.358230 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.358250 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:57Z","lastTransitionTime":"2025-10-06T08:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.461647 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.461706 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.461722 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.461747 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.461763 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:57Z","lastTransitionTime":"2025-10-06T08:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.565579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.565651 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.565670 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.565696 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.565714 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:57Z","lastTransitionTime":"2025-10-06T08:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.669742 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.669814 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.669836 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.669861 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.669884 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:57Z","lastTransitionTime":"2025-10-06T08:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.772842 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.772939 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.772961 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.772997 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.773021 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:57Z","lastTransitionTime":"2025-10-06T08:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.877514 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.877579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.877602 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.877637 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.877658 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:57Z","lastTransitionTime":"2025-10-06T08:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.981257 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.981361 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.981380 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.981404 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:57 crc kubenswrapper[4991]: I1006 08:19:57.981421 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:57Z","lastTransitionTime":"2025-10-06T08:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.084613 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.084673 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.084690 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.084714 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.084732 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:58Z","lastTransitionTime":"2025-10-06T08:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.188592 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.188658 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.188675 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.188702 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.188719 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:58Z","lastTransitionTime":"2025-10-06T08:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.243151 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:19:58 crc kubenswrapper[4991]: E1006 08:19:58.243369 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.244433 4991 scope.go:117] "RemoveContainer" containerID="a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.265543 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.288582 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.291715 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.291761 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.291773 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.291791 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.291802 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:58Z","lastTransitionTime":"2025-10-06T08:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.303801 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.323822 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.345732 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:46Z\\\",\\\"message\\\":\\\"rnal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:19:46.756823 6657 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe48a0 0x1fe4580 0x1fe4520} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:19:46.756831 6657 services_controller.go:443] Built service openshift-kube-scheduler/scheduler LB cluster-wide conf\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf1253495193
92e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.359345 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af
54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.374565 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.394393 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca698
16d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.394893 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.394922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.394929 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.394943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.394952 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:58Z","lastTransitionTime":"2025-10-06T08:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.415560 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.433067 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.450244 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.461526 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.474396 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc 
kubenswrapper[4991]: I1006 08:19:58.491314 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.497428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.497482 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.497494 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.497512 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.497524 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:58Z","lastTransitionTime":"2025-10-06T08:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.513489 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.529723 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.543328 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.600424 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.600486 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.600496 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.600514 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.600528 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:58Z","lastTransitionTime":"2025-10-06T08:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.622137 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/1.log" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.625987 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.626201 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.646046 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce3
90658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z
\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.671440 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"rea
son\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.693534 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.703427 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.703472 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.703483 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 
08:19:58.703501 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.703512 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:58Z","lastTransitionTime":"2025-10-06T08:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.707506 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.726224 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.741898 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc 
kubenswrapper[4991]: I1006 08:19:58.762812 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.779652 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.794518 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.806527 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.806564 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.806575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.806594 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.806605 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:58Z","lastTransitionTime":"2025-10-06T08:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.813201 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.829331 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.848392 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.867171 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.885286 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.909268 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.909323 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.909333 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.909349 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.909359 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:58Z","lastTransitionTime":"2025-10-06T08:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.912425 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:46Z\\\",\\\"message\\\":\\\"rnal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:19:46.756823 6657 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe48a0 0x1fe4580 0x1fe4520} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:19:46.756831 6657 services_controller.go:443] Built service openshift-kube-scheduler/scheduler LB cluster-wide 
conf\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.930170 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af
54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:58 crc kubenswrapper[4991]: I1006 08:19:58.960881 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:58Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.012220 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.012267 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.012277 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.012309 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.012319 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:59Z","lastTransitionTime":"2025-10-06T08:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.115465 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.115516 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.115528 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.115549 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.115562 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:59Z","lastTransitionTime":"2025-10-06T08:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.218761 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.218829 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.218847 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.218873 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.218892 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:59Z","lastTransitionTime":"2025-10-06T08:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.243423 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.243497 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:19:59 crc kubenswrapper[4991]: E1006 08:19:59.243583 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.243658 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:19:59 crc kubenswrapper[4991]: E1006 08:19:59.243737 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:19:59 crc kubenswrapper[4991]: E1006 08:19:59.243842 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.262909 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.285791 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.306330 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.321378 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.321423 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.321435 4991 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.321450 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.321462 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:59Z","lastTransitionTime":"2025-10-06T08:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.327947 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.347657 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.363621 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.378713 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.395374 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.418210 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:46Z\\\",\\\"message\\\":\\\"rnal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:19:46.756823 6657 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe48a0 0x1fe4580 0x1fe4520} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:19:46.756831 6657 services_controller.go:443] Built service openshift-kube-scheduler/scheduler LB cluster-wide 
conf\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.423584 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.423625 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.423637 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.423654 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.423665 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:59Z","lastTransitionTime":"2025-10-06T08:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.432277 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.449519 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.469288 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.525975 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.526282 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.526384 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.526455 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.526526 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:59Z","lastTransitionTime":"2025-10-06T08:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.543568 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.558819 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.573659 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.585581 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.598420 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc 
kubenswrapper[4991]: I1006 08:19:59.629086 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.629119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.629128 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.629142 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.629150 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:59Z","lastTransitionTime":"2025-10-06T08:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.631142 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/2.log" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.632130 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/1.log" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.634559 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c" exitCode=1 Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.634595 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c"} Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.634630 4991 scope.go:117] "RemoveContainer" containerID="a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.635359 4991 scope.go:117] "RemoveContainer" containerID="a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c" Oct 06 08:19:59 crc kubenswrapper[4991]: E1006 08:19:59.635532 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.654977 4991 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.665750 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.679100 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.709367 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:46Z\\\",\\\"message\\\":\\\"rnal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:19:46.756823 6657 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe48a0 0x1fe4580 0x1fe4520} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:19:46.756831 6657 services_controller.go:443] Built service openshift-kube-scheduler/scheduler LB cluster-wide conf\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event 
handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.721659 4991 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.732544 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.732585 4991 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.732595 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.732614 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.732631 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:59Z","lastTransitionTime":"2025-10-06T08:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.734986 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.749640 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca698
16d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.764882 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: 
I1006 08:19:59.791436 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.807722 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.822751 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.835168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.835271 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.835341 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.835375 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.835393 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:59Z","lastTransitionTime":"2025-10-06T08:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.836318 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.851068 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.872876 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.891499 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.907854 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.921992 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:59Z is after 2025-08-24T17:21:41Z" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.938225 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.938285 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.938335 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.938363 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:19:59 crc kubenswrapper[4991]: I1006 08:19:59.938383 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:19:59Z","lastTransitionTime":"2025-10-06T08:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.041881 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.041969 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.041982 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.042002 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.042015 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:00Z","lastTransitionTime":"2025-10-06T08:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.145743 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.145804 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.145815 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.145835 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.145847 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:00Z","lastTransitionTime":"2025-10-06T08:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.239037 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.243867 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:00 crc kubenswrapper[4991]: E1006 08:20:00.244265 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.249533 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.249583 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.249601 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.249624 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.249642 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:00Z","lastTransitionTime":"2025-10-06T08:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.253129 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.260733 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@
sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.284066 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.302238 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.326649 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.352697 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.352759 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.352776 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.352803 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.352821 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:00Z","lastTransitionTime":"2025-10-06T08:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.359851 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:46Z\\\",\\\"message\\\":\\\"rnal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:19:46.756823 6657 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe48a0 0x1fe4580 0x1fe4520} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:19:46.756831 6657 services_controller.go:443] Built service openshift-kube-scheduler/scheduler LB cluster-wide conf\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event 
handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.380008 4991 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.399428 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.420598 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca698
16d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.445713 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be
39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.456806 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.456897 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.456924 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.456960 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.456984 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:00Z","lastTransitionTime":"2025-10-06T08:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.470956 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.495098 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.514445 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.532360 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc 
kubenswrapper[4991]: I1006 08:20:00.553892 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.560190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.560248 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.560330 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.560362 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.560380 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:00Z","lastTransitionTime":"2025-10-06T08:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.582570 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.604518 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.627017 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:20:00Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.641477 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/2.log" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.663698 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.663759 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.663778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.663801 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.663819 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:00Z","lastTransitionTime":"2025-10-06T08:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.767984 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.768423 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.768782 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.769022 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.769274 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:00Z","lastTransitionTime":"2025-10-06T08:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.873292 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.873405 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.873427 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.873463 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.873486 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:00Z","lastTransitionTime":"2025-10-06T08:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.976632 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.976700 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.976719 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.976745 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:00 crc kubenswrapper[4991]: I1006 08:20:00.976763 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:00Z","lastTransitionTime":"2025-10-06T08:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.070085 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.070214 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.070257 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.070353 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:20:33.070278359 +0000 UTC m=+84.808028410 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.070408 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.070462 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.070472 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.070516 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.070578 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.070623 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.070654 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.070679 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.070635 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:20:33.070621948 +0000 UTC m=+84.808371969 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.070794 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-06 08:20:33.070755992 +0000 UTC m=+84.808506053 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.070837 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:20:33.070824554 +0000 UTC m=+84.808574605 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.071450 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.071497 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.071595 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:20:33.071567285 +0000 UTC m=+84.809317346 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.079671 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.079723 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.079736 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.079758 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.079812 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:01Z","lastTransitionTime":"2025-10-06T08:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.182428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.182556 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.182573 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.182596 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.182614 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:01Z","lastTransitionTime":"2025-10-06T08:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.243561 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.243605 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.243771 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.243803 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.243941 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:01 crc kubenswrapper[4991]: E1006 08:20:01.244079 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.285186 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.285251 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.285267 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.285337 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.285374 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:01Z","lastTransitionTime":"2025-10-06T08:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.388239 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.388292 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.388326 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.388347 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.388363 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:01Z","lastTransitionTime":"2025-10-06T08:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.491125 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.491172 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.491184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.491200 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.491211 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:01Z","lastTransitionTime":"2025-10-06T08:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.594389 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.594458 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.594477 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.594502 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.594522 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:01Z","lastTransitionTime":"2025-10-06T08:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.698063 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.698129 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.698144 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.698166 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.698183 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:01Z","lastTransitionTime":"2025-10-06T08:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.801400 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.801472 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.801493 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.801516 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.801535 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:01Z","lastTransitionTime":"2025-10-06T08:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.905523 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.905601 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.905646 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.905676 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:01 crc kubenswrapper[4991]: I1006 08:20:01.905697 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:01Z","lastTransitionTime":"2025-10-06T08:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.009627 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.009705 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.009729 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.009761 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.009788 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:02Z","lastTransitionTime":"2025-10-06T08:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.113587 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.113749 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.113771 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.113803 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.113829 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:02Z","lastTransitionTime":"2025-10-06T08:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.217261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.217328 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.217347 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.217367 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.217384 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:02Z","lastTransitionTime":"2025-10-06T08:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.243248 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:02 crc kubenswrapper[4991]: E1006 08:20:02.243422 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.327782 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.327888 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.327916 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.327946 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.327973 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:02Z","lastTransitionTime":"2025-10-06T08:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.431249 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.431323 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.431333 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.431352 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.431364 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:02Z","lastTransitionTime":"2025-10-06T08:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.535102 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.535352 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.535371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.535402 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.535417 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:02Z","lastTransitionTime":"2025-10-06T08:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.638713 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.638748 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.638762 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.638782 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.638796 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:02Z","lastTransitionTime":"2025-10-06T08:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.740892 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.740954 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.740965 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.740991 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.741005 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:02Z","lastTransitionTime":"2025-10-06T08:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.844270 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.844345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.844359 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.844377 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.844390 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:02Z","lastTransitionTime":"2025-10-06T08:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.948133 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.948187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.948201 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.948217 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:02 crc kubenswrapper[4991]: I1006 08:20:02.948229 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:02Z","lastTransitionTime":"2025-10-06T08:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.051051 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.051133 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.051151 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.051176 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.051195 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:03Z","lastTransitionTime":"2025-10-06T08:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.153951 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.154001 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.154012 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.154027 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.154038 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:03Z","lastTransitionTime":"2025-10-06T08:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.243672 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.243773 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:03 crc kubenswrapper[4991]: E1006 08:20:03.243814 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:03 crc kubenswrapper[4991]: E1006 08:20:03.243888 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.244048 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:03 crc kubenswrapper[4991]: E1006 08:20:03.244156 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.256263 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.256321 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.256334 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.256347 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.256355 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:03Z","lastTransitionTime":"2025-10-06T08:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.359774 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.359847 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.359857 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.359871 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.359915 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:03Z","lastTransitionTime":"2025-10-06T08:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.463134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.463176 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.463188 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.463205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.463214 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:03Z","lastTransitionTime":"2025-10-06T08:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.566263 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.566370 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.566393 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.566424 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.566446 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:03Z","lastTransitionTime":"2025-10-06T08:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.574155 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.594926 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.614448 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.632748 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.652778 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.669023 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.669076 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.669093 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.669119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.669136 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:03Z","lastTransitionTime":"2025-10-06T08:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.676797 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9249aa628a10b85fd84ec83e5c9a01083b28c11874aac1447a15d1e0d982c86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:46Z\\\",\\\"message\\\":\\\"rnal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1006 08:19:46.756823 6657 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe48a0 0x1fe4580 0x1fe4520} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:19:46Z is after 2025-08-24T17:21:41Z]\\\\nI1006 08:19:46.756831 6657 services_controller.go:443] Built service openshift-kube-scheduler/scheduler LB cluster-wide conf\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event 
handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.692392 4991 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.714003 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.736557 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca698
16d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.755105 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: 
I1006 08:20:03.771884 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.771927 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.771939 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.771956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.771967 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:03Z","lastTransitionTime":"2025-10-06T08:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.781810 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.798509 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.814634 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.827398 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.842936 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc 
kubenswrapper[4991]: I1006 08:20:03.857427 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.872728 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.875797 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.875940 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.876226 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.876482 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.876626 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:03Z","lastTransitionTime":"2025-10-06T08:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.887291 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.908793 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:03Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.979482 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.980020 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.980152 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.980352 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:03 crc kubenswrapper[4991]: I1006 08:20:03.980510 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:03Z","lastTransitionTime":"2025-10-06T08:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.084373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.084868 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.085095 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.085356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.085598 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:04Z","lastTransitionTime":"2025-10-06T08:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.189451 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.189880 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.190026 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.190214 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.190671 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:04Z","lastTransitionTime":"2025-10-06T08:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.243443 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:04 crc kubenswrapper[4991]: E1006 08:20:04.243660 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.293926 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.293995 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.294013 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.294037 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.294055 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:04Z","lastTransitionTime":"2025-10-06T08:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.397134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.397211 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.397236 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.397269 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.397349 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:04Z","lastTransitionTime":"2025-10-06T08:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.408973 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:04 crc kubenswrapper[4991]: E1006 08:20:04.409265 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:20:04 crc kubenswrapper[4991]: E1006 08:20:04.409406 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs podName:3e38e446-d0d7-463a-987a-110a8e95fe84 nodeName:}" failed. No retries permitted until 2025-10-06 08:20:20.409376049 +0000 UTC m=+72.147126100 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs") pod "network-metrics-daemon-787zw" (UID: "3e38e446-d0d7-463a-987a-110a8e95fe84") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.500632 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.500690 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.500708 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.500733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.500751 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:04Z","lastTransitionTime":"2025-10-06T08:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.604129 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.604534 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.604691 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.604862 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.605009 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:04Z","lastTransitionTime":"2025-10-06T08:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.708062 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.708171 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.708193 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.708218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.708234 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:04Z","lastTransitionTime":"2025-10-06T08:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.811559 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.811629 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.811647 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.811672 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.811690 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:04Z","lastTransitionTime":"2025-10-06T08:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.892856 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.894558 4991 scope.go:117] "RemoveContainer" containerID="a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c" Oct 06 08:20:04 crc kubenswrapper[4991]: E1006 08:20:04.895075 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.914933 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.914995 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.915014 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.915043 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.915063 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:04Z","lastTransitionTime":"2025-10-06T08:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.915894 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.940389 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca698
16d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.957039 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:04 crc kubenswrapper[4991]: I1006 08:20:04.974625 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:04Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.010954 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be
39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.018018 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.018077 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.018101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.018131 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.018153 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.033861 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.054165 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.077022 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e2
90c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.096773 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.111670 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.121125 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.121259 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.121280 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.121333 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.121352 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.127556 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.142758 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.176193 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf1253495193
92e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.196648 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af
54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.218609 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.224166 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.224572 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.224754 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.224910 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.225070 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.241766 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.242915 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.243022 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.243095 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:05 crc kubenswrapper[4991]: E1006 08:20:05.243323 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:05 crc kubenswrapper[4991]: E1006 08:20:05.243536 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:05 crc kubenswrapper[4991]: E1006 08:20:05.243694 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.263924 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.283471 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.329486 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.330044 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.330218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.330603 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.330777 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.433751 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.434144 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.434354 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.434517 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.434639 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.537782 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.537822 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.537840 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.537865 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.537882 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.641866 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.641917 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.641939 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.641970 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.641993 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.735048 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.735114 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.735138 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.735163 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.735180 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: E1006 08:20:05.759238 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.765957 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.766017 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.766042 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.766070 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.766092 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: E1006 08:20:05.789281 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.795128 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.795191 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.795203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.795231 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.795247 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: E1006 08:20:05.818196 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.825790 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.825851 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.825871 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.825899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.825920 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: E1006 08:20:05.848156 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.854184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.854270 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.854291 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.855026 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.855092 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: E1006 08:20:05.876169 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:05Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:05 crc kubenswrapper[4991]: E1006 08:20:05.876422 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.878847 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.878896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.878915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.878935 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.878953 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.982225 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.982282 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.982336 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.982373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:05 crc kubenswrapper[4991]: I1006 08:20:05.982398 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:05Z","lastTransitionTime":"2025-10-06T08:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.084618 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.084660 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.084670 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.084701 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.084716 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:06Z","lastTransitionTime":"2025-10-06T08:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.187459 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.187511 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.187525 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.187544 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.187556 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:06Z","lastTransitionTime":"2025-10-06T08:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.243059 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:06 crc kubenswrapper[4991]: E1006 08:20:06.243348 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.290670 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.290732 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.290751 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.290776 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.290794 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:06Z","lastTransitionTime":"2025-10-06T08:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.393927 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.394358 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.394656 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.394953 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.395454 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:06Z","lastTransitionTime":"2025-10-06T08:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.498622 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.499072 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.499219 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.499399 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.499532 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:06Z","lastTransitionTime":"2025-10-06T08:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.603194 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.603264 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.603286 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.603355 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.603379 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:06Z","lastTransitionTime":"2025-10-06T08:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.706537 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.706595 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.706607 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.706630 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.706645 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:06Z","lastTransitionTime":"2025-10-06T08:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.809787 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.809865 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.809882 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.809908 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.809926 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:06Z","lastTransitionTime":"2025-10-06T08:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.914505 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.914552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.914565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.914590 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:06 crc kubenswrapper[4991]: I1006 08:20:06.914604 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:06Z","lastTransitionTime":"2025-10-06T08:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.018022 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.018472 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.018495 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.018526 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.018545 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:07Z","lastTransitionTime":"2025-10-06T08:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.122074 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.122144 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.122157 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.122176 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.122188 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:07Z","lastTransitionTime":"2025-10-06T08:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.226158 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.226232 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.226256 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.226287 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.226358 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:07Z","lastTransitionTime":"2025-10-06T08:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.243362 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:07 crc kubenswrapper[4991]: E1006 08:20:07.243647 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.243779 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.243790 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:07 crc kubenswrapper[4991]: E1006 08:20:07.244029 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:07 crc kubenswrapper[4991]: E1006 08:20:07.246797 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.329708 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.329788 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.329809 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.329840 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.329859 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:07Z","lastTransitionTime":"2025-10-06T08:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.433519 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.433573 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.433586 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.433608 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.433622 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:07Z","lastTransitionTime":"2025-10-06T08:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.537162 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.537209 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.537222 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.537242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.537256 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:07Z","lastTransitionTime":"2025-10-06T08:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.640653 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.640738 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.640761 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.640788 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.640806 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:07Z","lastTransitionTime":"2025-10-06T08:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.744180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.744246 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.744269 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.744321 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.744342 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:07Z","lastTransitionTime":"2025-10-06T08:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.847558 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.847938 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.848096 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.848249 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.848430 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:07Z","lastTransitionTime":"2025-10-06T08:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.951692 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.951756 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.951779 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.951807 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:07 crc kubenswrapper[4991]: I1006 08:20:07.951828 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:07Z","lastTransitionTime":"2025-10-06T08:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.054874 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.055239 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.055493 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.055691 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.055886 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:08Z","lastTransitionTime":"2025-10-06T08:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.159100 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.159167 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.159189 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.159219 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.159241 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:08Z","lastTransitionTime":"2025-10-06T08:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.243485 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:08 crc kubenswrapper[4991]: E1006 08:20:08.243664 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.262769 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.262829 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.262842 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.262861 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.262875 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:08Z","lastTransitionTime":"2025-10-06T08:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.364971 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.365018 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.365028 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.365044 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.365055 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:08Z","lastTransitionTime":"2025-10-06T08:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.468520 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.468587 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.468606 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.468632 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.468650 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:08Z","lastTransitionTime":"2025-10-06T08:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.572116 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.572185 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.572203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.572232 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.572250 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:08Z","lastTransitionTime":"2025-10-06T08:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.675321 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.675382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.675395 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.675417 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.675431 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:08Z","lastTransitionTime":"2025-10-06T08:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.778386 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.778492 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.778511 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.778537 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.778555 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:08Z","lastTransitionTime":"2025-10-06T08:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.881865 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.881941 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.881964 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.881992 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.882013 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:08Z","lastTransitionTime":"2025-10-06T08:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.985633 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.985711 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.985740 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.985767 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:08 crc kubenswrapper[4991]: I1006 08:20:08.985789 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:08Z","lastTransitionTime":"2025-10-06T08:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.089238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.089319 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.089333 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.089354 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.089368 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:09Z","lastTransitionTime":"2025-10-06T08:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.191911 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.192259 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.192365 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.192471 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.192566 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:09Z","lastTransitionTime":"2025-10-06T08:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.242855 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.242892 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.242863 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:09 crc kubenswrapper[4991]: E1006 08:20:09.243127 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:09 crc kubenswrapper[4991]: E1006 08:20:09.243260 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:09 crc kubenswrapper[4991]: E1006 08:20:09.243465 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.269060 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.293013 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874e
fa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\"
:\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.295926 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.296015 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.296046 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.296083 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.296109 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:09Z","lastTransitionTime":"2025-10-06T08:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.311078 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc 
kubenswrapper[4991]: I1006 08:20:09.345880 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.366680 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.385537 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.399433 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.399485 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.399506 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.399542 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.399560 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:09Z","lastTransitionTime":"2025-10-06T08:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.402281 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.423198 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e2
90c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.443961 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.464243 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.480083 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.503166 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.503228 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.503243 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.503268 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.503284 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:09Z","lastTransitionTime":"2025-10-06T08:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.504113 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf1253495193
92e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.524774 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af
54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.547361 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.566005 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.585998 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.603736 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.612888 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.612975 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.612999 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.613026 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.613047 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:09Z","lastTransitionTime":"2025-10-06T08:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.627906 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:09Z 
is after 2025-08-24T17:21:41Z" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.716658 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.716713 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.716733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.716754 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.716770 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:09Z","lastTransitionTime":"2025-10-06T08:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.819688 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.819760 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.819770 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.819791 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.819801 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:09Z","lastTransitionTime":"2025-10-06T08:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.922737 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.922801 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.922814 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.922833 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:09 crc kubenswrapper[4991]: I1006 08:20:09.922847 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:09Z","lastTransitionTime":"2025-10-06T08:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.030371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.030461 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.030484 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.030515 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.030541 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:10Z","lastTransitionTime":"2025-10-06T08:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.133270 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.133323 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.133334 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.133349 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.133358 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:10Z","lastTransitionTime":"2025-10-06T08:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.236536 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.236579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.236588 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.236604 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.236614 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:10Z","lastTransitionTime":"2025-10-06T08:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.243521 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:10 crc kubenswrapper[4991]: E1006 08:20:10.243710 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.339580 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.339696 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.339724 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.339860 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.339967 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:10Z","lastTransitionTime":"2025-10-06T08:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.443826 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.444268 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.444390 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.444475 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.444534 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:10Z","lastTransitionTime":"2025-10-06T08:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.547914 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.548017 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.548040 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.548082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.548111 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:10Z","lastTransitionTime":"2025-10-06T08:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.651791 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.651852 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.651869 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.651894 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.651914 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:10Z","lastTransitionTime":"2025-10-06T08:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.756082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.756163 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.756190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.756224 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.756250 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:10Z","lastTransitionTime":"2025-10-06T08:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.859126 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.859213 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.859240 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.859275 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.859334 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:10Z","lastTransitionTime":"2025-10-06T08:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.963648 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.963980 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.964100 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.964165 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:10 crc kubenswrapper[4991]: I1006 08:20:10.964231 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:10Z","lastTransitionTime":"2025-10-06T08:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.067459 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.067530 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.067548 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.067575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.067593 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:11Z","lastTransitionTime":"2025-10-06T08:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.171970 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.172006 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.172017 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.172032 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.172043 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:11Z","lastTransitionTime":"2025-10-06T08:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.242984 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.243067 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.243077 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:11 crc kubenswrapper[4991]: E1006 08:20:11.243206 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:11 crc kubenswrapper[4991]: E1006 08:20:11.243444 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:11 crc kubenswrapper[4991]: E1006 08:20:11.243943 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.275524 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.275591 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.275610 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.275635 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.275657 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:11Z","lastTransitionTime":"2025-10-06T08:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.379211 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.379350 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.379374 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.379404 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.379422 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:11Z","lastTransitionTime":"2025-10-06T08:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.482446 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.482514 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.482529 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.482552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.482567 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:11Z","lastTransitionTime":"2025-10-06T08:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.585719 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.585767 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.585779 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.585799 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.585813 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:11Z","lastTransitionTime":"2025-10-06T08:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.689016 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.689408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.689524 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.689700 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.689797 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:11Z","lastTransitionTime":"2025-10-06T08:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.793286 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.793367 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.793381 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.793402 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.793416 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:11Z","lastTransitionTime":"2025-10-06T08:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.897046 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.897098 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.897110 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.897130 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:11 crc kubenswrapper[4991]: I1006 08:20:11.897142 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:11Z","lastTransitionTime":"2025-10-06T08:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.000902 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.001467 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.001762 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.002069 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.002259 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:12Z","lastTransitionTime":"2025-10-06T08:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.106424 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.106511 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.106539 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.106572 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.106596 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:12Z","lastTransitionTime":"2025-10-06T08:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.209748 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.209902 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.209922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.209947 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.209965 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:12Z","lastTransitionTime":"2025-10-06T08:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.243064 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:12 crc kubenswrapper[4991]: E1006 08:20:12.243252 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.312754 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.313175 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.313277 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.313426 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.315119 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:12Z","lastTransitionTime":"2025-10-06T08:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.418585 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.418654 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.418676 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.418703 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.418732 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:12Z","lastTransitionTime":"2025-10-06T08:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.522758 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.522837 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.522866 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.522899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.522926 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:12Z","lastTransitionTime":"2025-10-06T08:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.626660 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.626724 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.626742 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.626770 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.626788 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:12Z","lastTransitionTime":"2025-10-06T08:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.731225 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.731284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.731342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.731377 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.731401 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:12Z","lastTransitionTime":"2025-10-06T08:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.834484 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.834524 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.834538 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.834562 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.834589 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:12Z","lastTransitionTime":"2025-10-06T08:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.937190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.937407 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.937431 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.937471 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:12 crc kubenswrapper[4991]: I1006 08:20:12.937530 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:12Z","lastTransitionTime":"2025-10-06T08:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.040513 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.040741 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.040768 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.040809 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.040832 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:13Z","lastTransitionTime":"2025-10-06T08:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.143603 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.143653 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.143669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.143692 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.143708 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:13Z","lastTransitionTime":"2025-10-06T08:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.243398 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.243499 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:13 crc kubenswrapper[4991]: E1006 08:20:13.243568 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:13 crc kubenswrapper[4991]: E1006 08:20:13.243936 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.244052 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:13 crc kubenswrapper[4991]: E1006 08:20:13.244263 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.245526 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.245553 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.245563 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.245580 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.245595 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:13Z","lastTransitionTime":"2025-10-06T08:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.348617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.348671 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.348683 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.348705 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.348722 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:13Z","lastTransitionTime":"2025-10-06T08:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.451201 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.451236 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.451247 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.451262 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.451272 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:13Z","lastTransitionTime":"2025-10-06T08:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.555047 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.555093 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.555105 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.555126 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.555138 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:13Z","lastTransitionTime":"2025-10-06T08:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.657861 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.657901 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.657912 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.657931 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.657944 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:13Z","lastTransitionTime":"2025-10-06T08:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.760415 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.760483 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.760501 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.760524 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.760538 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:13Z","lastTransitionTime":"2025-10-06T08:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.863091 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.863123 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.863133 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.863178 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.863189 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:13Z","lastTransitionTime":"2025-10-06T08:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.966208 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.966332 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.966361 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.966398 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:13 crc kubenswrapper[4991]: I1006 08:20:13.966421 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:13Z","lastTransitionTime":"2025-10-06T08:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.069354 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.069426 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.069444 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.069476 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.069495 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:14Z","lastTransitionTime":"2025-10-06T08:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.172775 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.172853 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.172873 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.172904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.172922 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:14Z","lastTransitionTime":"2025-10-06T08:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.243162 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:14 crc kubenswrapper[4991]: E1006 08:20:14.243415 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.276211 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.276289 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.276351 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.276371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.276385 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:14Z","lastTransitionTime":"2025-10-06T08:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.380750 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.380795 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.380804 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.380822 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.380833 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:14Z","lastTransitionTime":"2025-10-06T08:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.483964 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.484010 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.484019 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.484035 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.484045 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:14Z","lastTransitionTime":"2025-10-06T08:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.587110 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.587165 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.587179 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.587199 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.587213 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:14Z","lastTransitionTime":"2025-10-06T08:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.690830 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.690910 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.690929 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.690959 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.690979 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:14Z","lastTransitionTime":"2025-10-06T08:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.794261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.794331 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.794344 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.794364 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.794375 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:14Z","lastTransitionTime":"2025-10-06T08:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.897708 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.897779 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.897794 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.897818 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:14 crc kubenswrapper[4991]: I1006 08:20:14.897834 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:14Z","lastTransitionTime":"2025-10-06T08:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.001024 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.001078 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.001088 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.001107 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.001118 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.104660 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.104698 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.104710 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.104731 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.104745 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.208251 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.208326 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.208339 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.208358 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.208375 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.243437 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.243457 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.243549 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:15 crc kubenswrapper[4991]: E1006 08:20:15.243672 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:15 crc kubenswrapper[4991]: E1006 08:20:15.243788 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:15 crc kubenswrapper[4991]: E1006 08:20:15.244022 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.312050 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.312457 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.312673 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.312832 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.312949 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.416261 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.416608 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.416678 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.416777 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.416844 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.520223 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.520271 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.520283 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.520329 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.520345 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.623226 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.623308 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.623321 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.623363 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.623379 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.726324 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.726366 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.726374 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.726395 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.726405 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.829900 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.829999 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.830022 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.830048 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.830065 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.894789 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.894840 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.894850 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.894870 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.894882 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: E1006 08:20:15.908987 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:15Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.914023 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.914159 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.914177 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.914238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.914259 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: E1006 08:20:15.928440 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:15Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.933381 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.933421 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.933432 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.933452 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.933465 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: E1006 08:20:15.950698 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:15Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.955959 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.956009 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.956019 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.956037 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.956051 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: E1006 08:20:15.968175 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:15Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.972474 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.972651 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.972741 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.972844 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.972936 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:15 crc kubenswrapper[4991]: E1006 08:20:15.986062 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:15Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:15 crc kubenswrapper[4991]: E1006 08:20:15.986192 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.988212 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.988365 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.988463 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.988558 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:15 crc kubenswrapper[4991]: I1006 08:20:15.988652 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:15Z","lastTransitionTime":"2025-10-06T08:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.092565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.092628 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.092642 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.092664 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.092678 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:16Z","lastTransitionTime":"2025-10-06T08:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.195988 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.196040 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.196050 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.196068 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.196085 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:16Z","lastTransitionTime":"2025-10-06T08:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.242976 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:16 crc kubenswrapper[4991]: E1006 08:20:16.243436 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.243703 4991 scope.go:117] "RemoveContainer" containerID="a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c" Oct 06 08:20:16 crc kubenswrapper[4991]: E1006 08:20:16.243892 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.298680 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.298740 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.298752 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.298772 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.298787 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:16Z","lastTransitionTime":"2025-10-06T08:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.401539 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.401599 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.401611 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.401633 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.401648 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:16Z","lastTransitionTime":"2025-10-06T08:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.504546 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.504586 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.504598 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.504615 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.504627 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:16Z","lastTransitionTime":"2025-10-06T08:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.607338 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.607384 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.607397 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.607416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.607429 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:16Z","lastTransitionTime":"2025-10-06T08:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.711047 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.711251 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.711284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.711379 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.711450 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:16Z","lastTransitionTime":"2025-10-06T08:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.814396 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.814455 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.814473 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.814497 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.814513 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:16Z","lastTransitionTime":"2025-10-06T08:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.917511 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.917557 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.917566 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.917582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:16 crc kubenswrapper[4991]: I1006 08:20:16.917592 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:16Z","lastTransitionTime":"2025-10-06T08:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.019795 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.019836 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.019846 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.019861 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.019871 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:17Z","lastTransitionTime":"2025-10-06T08:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.122747 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.122801 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.122812 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.122832 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.122845 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:17Z","lastTransitionTime":"2025-10-06T08:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.225575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.225629 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.225640 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.225658 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.225670 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:17Z","lastTransitionTime":"2025-10-06T08:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.243059 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:20:17 crc kubenswrapper[4991]: E1006 08:20:17.243173 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.243243 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.243258 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:20:17 crc kubenswrapper[4991]: E1006 08:20:17.243501 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 08:20:17 crc kubenswrapper[4991]: E1006 08:20:17.243623 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.328568 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.328604 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.328615 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.328629 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.328638 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:17Z","lastTransitionTime":"2025-10-06T08:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.431106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.431157 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.431170 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.431189 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.431201 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:17Z","lastTransitionTime":"2025-10-06T08:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.533339 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.533385 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.533396 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.533410 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.533422 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:17Z","lastTransitionTime":"2025-10-06T08:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.636574 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.636624 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.636636 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.636652 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.636664 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:17Z","lastTransitionTime":"2025-10-06T08:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.740326 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.740373 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.740383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.740401 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.740410 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:17Z","lastTransitionTime":"2025-10-06T08:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.844066 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.844146 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.844164 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.844190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.844208 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:17Z","lastTransitionTime":"2025-10-06T08:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.946853 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.947198 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.947208 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.947228 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:17 crc kubenswrapper[4991]: I1006 08:20:17.947239 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:17Z","lastTransitionTime":"2025-10-06T08:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.050222 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.050274 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.050325 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.050347 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.050364 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:18Z","lastTransitionTime":"2025-10-06T08:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.153674 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.153731 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.153742 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.153767 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.153779 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:18Z","lastTransitionTime":"2025-10-06T08:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.242993 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw"
Oct 06 08:20:18 crc kubenswrapper[4991]: E1006 08:20:18.243205 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84"
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.256922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.256976 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.256990 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.257013 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.257030 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:18Z","lastTransitionTime":"2025-10-06T08:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.360707 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.360772 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.360981 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.361012 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.361033 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:18Z","lastTransitionTime":"2025-10-06T08:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.464918 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.465000 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.465019 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.465049 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.465072 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:18Z","lastTransitionTime":"2025-10-06T08:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.567676 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.567739 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.567757 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.567781 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.567799 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:18Z","lastTransitionTime":"2025-10-06T08:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.670818 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.670896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.670915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.670945 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.670964 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:18Z","lastTransitionTime":"2025-10-06T08:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.773954 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.774015 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.774029 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.774050 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.774064 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:18Z","lastTransitionTime":"2025-10-06T08:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.876972 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.877026 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.877034 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.877051 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.877062 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:18Z","lastTransitionTime":"2025-10-06T08:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.980546 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.980604 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.980617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.980636 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:18 crc kubenswrapper[4991]: I1006 08:20:18.980645 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:18Z","lastTransitionTime":"2025-10-06T08:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.083092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.083137 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.083146 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.083163 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.083173 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:19Z","lastTransitionTime":"2025-10-06T08:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.186605 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.186668 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.186683 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.186705 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.186719 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:19Z","lastTransitionTime":"2025-10-06T08:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.243511 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.243587 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 08:20:19 crc kubenswrapper[4991]: E1006 08:20:19.243696 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 08:20:19 crc kubenswrapper[4991]: E1006 08:20:19.243771 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.244120 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:20:19 crc kubenswrapper[4991]: E1006 08:20:19.244217 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.268192 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-0
6T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.283533 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.288995 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.289039 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.289071 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 
08:20:19.289088 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.289098 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:19Z","lastTransitionTime":"2025-10-06T08:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.297150 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.310823 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.323958 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc 
kubenswrapper[4991]: I1006 08:20:19.339808 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b415111776
7f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 
08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.354596 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.368481 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.383021 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.391338 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.391398 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.391413 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.391451 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.391462 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:19Z","lastTransitionTime":"2025-10-06T08:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.398218 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0
573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.411688 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.426125 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.438201 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.452783 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.472533 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf1253495193
92e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.486507 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af
54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.496975 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.497040 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.497065 4991 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.497099 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.497126 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:19Z","lastTransitionTime":"2025-10-06T08:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.501796 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390
658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\
\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.517870 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687f
ddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reaso
n\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:19Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.599718 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.599801 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.599823 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.599856 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.599880 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:19Z","lastTransitionTime":"2025-10-06T08:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.702899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.702948 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.702960 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.702981 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.703008 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:19Z","lastTransitionTime":"2025-10-06T08:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.806079 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.806128 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.806166 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.806191 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.806209 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:19Z","lastTransitionTime":"2025-10-06T08:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.908796 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.908846 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.908864 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.908886 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:19 crc kubenswrapper[4991]: I1006 08:20:19.908902 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:19Z","lastTransitionTime":"2025-10-06T08:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.012473 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.012522 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.012541 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.012571 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.012588 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:20Z","lastTransitionTime":"2025-10-06T08:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.115255 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.115336 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.115348 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.115369 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.115383 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:20Z","lastTransitionTime":"2025-10-06T08:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.218246 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.218287 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.218317 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.218337 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.218348 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:20Z","lastTransitionTime":"2025-10-06T08:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.243492 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:20 crc kubenswrapper[4991]: E1006 08:20:20.243682 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.320837 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.320905 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.320929 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.320958 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.320980 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:20Z","lastTransitionTime":"2025-10-06T08:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.423408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.423466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.423478 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.423498 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.423512 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:20Z","lastTransitionTime":"2025-10-06T08:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.500579 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:20 crc kubenswrapper[4991]: E1006 08:20:20.500855 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:20:20 crc kubenswrapper[4991]: E1006 08:20:20.500961 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs podName:3e38e446-d0d7-463a-987a-110a8e95fe84 nodeName:}" failed. No retries permitted until 2025-10-06 08:20:52.500938613 +0000 UTC m=+104.238688634 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs") pod "network-metrics-daemon-787zw" (UID: "3e38e446-d0d7-463a-987a-110a8e95fe84") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.526205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.526251 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.526264 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.526284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.526313 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:20Z","lastTransitionTime":"2025-10-06T08:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.629108 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.629160 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.629175 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.629193 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.629206 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:20Z","lastTransitionTime":"2025-10-06T08:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.731814 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.731922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.731942 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.732469 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.732528 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:20Z","lastTransitionTime":"2025-10-06T08:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.834560 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.834596 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.834607 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.834622 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.834633 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:20Z","lastTransitionTime":"2025-10-06T08:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.936668 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.936711 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.936724 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.936743 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:20 crc kubenswrapper[4991]: I1006 08:20:20.936754 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:20Z","lastTransitionTime":"2025-10-06T08:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.038630 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.038674 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.038683 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.038697 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.038707 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:21Z","lastTransitionTime":"2025-10-06T08:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.140521 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.140585 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.140596 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.140612 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.140622 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:21Z","lastTransitionTime":"2025-10-06T08:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.243261 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.243352 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.243433 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:21 crc kubenswrapper[4991]: E1006 08:20:21.243453 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:21 crc kubenswrapper[4991]: E1006 08:20:21.243523 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:21 crc kubenswrapper[4991]: E1006 08:20:21.243570 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.243689 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.243749 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.243767 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.243796 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.243816 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:21Z","lastTransitionTime":"2025-10-06T08:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.346539 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.346578 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.346588 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.346605 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.346617 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:21Z","lastTransitionTime":"2025-10-06T08:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.449550 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.449627 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.449646 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.449672 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.449694 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:21Z","lastTransitionTime":"2025-10-06T08:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.554026 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.554122 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.554149 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.554181 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.554204 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:21Z","lastTransitionTime":"2025-10-06T08:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.657366 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.657433 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.657454 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.657483 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.657551 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:21Z","lastTransitionTime":"2025-10-06T08:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.761183 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.761241 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.761259 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.761284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.761329 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:21Z","lastTransitionTime":"2025-10-06T08:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.863877 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.863942 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.863961 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.863989 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.864008 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:21Z","lastTransitionTime":"2025-10-06T08:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.967811 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.967885 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.967904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.967930 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:21 crc kubenswrapper[4991]: I1006 08:20:21.967952 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:21Z","lastTransitionTime":"2025-10-06T08:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.070891 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.070942 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.070969 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.070987 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.070997 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:22Z","lastTransitionTime":"2025-10-06T08:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.173766 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.173812 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.173825 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.173850 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.173864 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:22Z","lastTransitionTime":"2025-10-06T08:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.242944 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:22 crc kubenswrapper[4991]: E1006 08:20:22.243132 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.276580 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.276649 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.276668 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.276697 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.276716 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:22Z","lastTransitionTime":"2025-10-06T08:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.379333 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.379392 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.379404 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.379421 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.379430 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:22Z","lastTransitionTime":"2025-10-06T08:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.482213 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.482278 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.482338 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.482389 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.482408 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:22Z","lastTransitionTime":"2025-10-06T08:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.584992 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.585058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.585075 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.585101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.585115 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:22Z","lastTransitionTime":"2025-10-06T08:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.688379 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.688438 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.688451 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.688471 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.688484 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:22Z","lastTransitionTime":"2025-10-06T08:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.791579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.791678 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.791691 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.791713 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.791730 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:22Z","lastTransitionTime":"2025-10-06T08:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.895503 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.895855 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.896057 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.896195 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:22 crc kubenswrapper[4991]: I1006 08:20:22.896358 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:22Z","lastTransitionTime":"2025-10-06T08:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:22.999962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.000019 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.000032 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.000054 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.000070 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:23Z","lastTransitionTime":"2025-10-06T08:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.103218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.103264 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.103272 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.103288 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.103313 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:23Z","lastTransitionTime":"2025-10-06T08:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.208699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.208765 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.208782 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.208809 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.208828 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:23Z","lastTransitionTime":"2025-10-06T08:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.243504 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.243532 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.243571 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:23 crc kubenswrapper[4991]: E1006 08:20:23.243675 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:23 crc kubenswrapper[4991]: E1006 08:20:23.243823 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:23 crc kubenswrapper[4991]: E1006 08:20:23.244032 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.311577 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.311647 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.311671 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.311696 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.311715 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:23Z","lastTransitionTime":"2025-10-06T08:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.415008 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.415065 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.415082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.415106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.415125 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:23Z","lastTransitionTime":"2025-10-06T08:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.519058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.519116 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.519134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.519160 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.519179 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:23Z","lastTransitionTime":"2025-10-06T08:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.621834 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.621909 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.621923 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.621946 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.621960 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:23Z","lastTransitionTime":"2025-10-06T08:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.741865 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjvmw_58386a1a-6047-42ce-a952-43f397822919/kube-multus/0.log" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.741969 4991 generic.go:334] "Generic (PLEG): container finished" podID="58386a1a-6047-42ce-a952-43f397822919" containerID="688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793" exitCode=1 Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.742023 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjvmw" event={"ID":"58386a1a-6047-42ce-a952-43f397822919","Type":"ContainerDied","Data":"688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.742790 4991 scope.go:117] "RemoveContainer" containerID="688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.748119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.748186 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.748211 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.748243 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.748266 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:23Z","lastTransitionTime":"2025-10-06T08:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.771193 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.798594 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.814080 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.825470 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.839675 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:23Z\\\",\\\"message\\\":\\\"2025-10-06T08:19:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd\\\\n2025-10-06T08:19:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd to /host/opt/cni/bin/\\\\n2025-10-06T08:19:38Z [verbose] multus-daemon started\\\\n2025-10-06T08:19:38Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:20:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.851411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.851467 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.851480 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.851503 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.851520 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:23Z","lastTransitionTime":"2025-10-06T08:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.862812 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf1253495193
92e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.876716 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af
54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.891061 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.908105 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca698
16d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.932026 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be
39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.943593 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.953890 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.953958 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:23 crc 
kubenswrapper[4991]: I1006 08:20:23.953976 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.954002 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.954022 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:23Z","lastTransitionTime":"2025-10-06T08:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.956243 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.967772 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc kubenswrapper[4991]: I1006 08:20:23.981437 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:23 crc 
kubenswrapper[4991]: I1006 08:20:23.997071 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b415111776
7f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 
08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:23Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.016266 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.032836 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.051200 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.057077 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.057210 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.057335 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.057441 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.057564 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:24Z","lastTransitionTime":"2025-10-06T08:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.161101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.161166 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.161180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.161200 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.161217 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:24Z","lastTransitionTime":"2025-10-06T08:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.243346 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:24 crc kubenswrapper[4991]: E1006 08:20:24.243539 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.263535 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.263584 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.263593 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.263613 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.263625 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:24Z","lastTransitionTime":"2025-10-06T08:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.366620 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.366675 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.366692 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.366713 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.366729 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:24Z","lastTransitionTime":"2025-10-06T08:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.470689 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.471048 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.471255 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.471438 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.471609 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:24Z","lastTransitionTime":"2025-10-06T08:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.575974 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.576042 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.576062 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.576092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.576111 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:24Z","lastTransitionTime":"2025-10-06T08:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.680084 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.680186 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.680203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.680233 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.680251 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:24Z","lastTransitionTime":"2025-10-06T08:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.749245 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjvmw_58386a1a-6047-42ce-a952-43f397822919/kube-multus/0.log" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.749383 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjvmw" event={"ID":"58386a1a-6047-42ce-a952-43f397822919","Type":"ContainerStarted","Data":"e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.775933 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\
"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\
"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.783641 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.783697 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.783715 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.783739 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.783757 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:24Z","lastTransitionTime":"2025-10-06T08:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.796420 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.854005 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setu
p\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.876460 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.886682 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.886775 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.886799 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 
08:20:24.886827 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.886847 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:24Z","lastTransitionTime":"2025-10-06T08:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.900998 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.917992 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.934252 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc 
kubenswrapper[4991]: I1006 08:20:24.955158 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.974121 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.992362 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.992454 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.992471 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.992489 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.992523 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:24Z","lastTransitionTime":"2025-10-06T08:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:24 crc kubenswrapper[4991]: I1006 08:20:24.993151 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:24Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.006946 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:25Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.018234 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
6T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:25Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.035072 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:25Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.052534 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:25Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.065861 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:23Z\\\",\\\"message\\\":\\\"2025-10-06T08:19:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd\\\\n2025-10-06T08:19:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd to /host/opt/cni/bin/\\\\n2025-10-06T08:19:38Z [verbose] multus-daemon started\\\\n2025-10-06T08:19:38Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:20:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:25Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.086908 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf1253495193
92e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:25Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.095885 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.095934 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.095945 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.095963 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.095977 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:25Z","lastTransitionTime":"2025-10-06T08:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.102785 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:25Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.116115 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:25Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.199235 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.199371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.199395 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.199471 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.199493 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:25Z","lastTransitionTime":"2025-10-06T08:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.243067 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.243174 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:25 crc kubenswrapper[4991]: E1006 08:20:25.243256 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:25 crc kubenswrapper[4991]: E1006 08:20:25.243443 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.243622 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:25 crc kubenswrapper[4991]: E1006 08:20:25.243714 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.302854 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.302919 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.302937 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.302963 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.302985 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:25Z","lastTransitionTime":"2025-10-06T08:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.406517 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.406585 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.406670 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.406699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.406718 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:25Z","lastTransitionTime":"2025-10-06T08:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.511401 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.511483 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.511502 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.511534 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.511553 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:25Z","lastTransitionTime":"2025-10-06T08:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.614351 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.614439 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.614453 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.614475 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.614488 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:25Z","lastTransitionTime":"2025-10-06T08:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.717784 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.717829 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.717839 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.717856 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.717867 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:25Z","lastTransitionTime":"2025-10-06T08:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.821168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.821230 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.821251 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.821281 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.821400 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:25Z","lastTransitionTime":"2025-10-06T08:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.925498 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.925552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.925565 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.925585 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:25 crc kubenswrapper[4991]: I1006 08:20:25.925598 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:25Z","lastTransitionTime":"2025-10-06T08:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.029766 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.029843 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.029867 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.029898 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.029924 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.134351 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.134466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.134488 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.134521 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.134544 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.238505 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.238586 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.238599 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.238661 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.238684 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.243833 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:26 crc kubenswrapper[4991]: E1006 08:20:26.244049 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.342492 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.342606 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.342632 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.342664 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.342691 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.383139 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.383325 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.383348 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.383374 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.383392 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: E1006 08:20:26.406039 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:26Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.410743 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.410781 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.410794 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.410812 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.410824 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: E1006 08:20:26.433284 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:26Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.439409 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.439478 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.439498 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.439526 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.439548 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: E1006 08:20:26.464325 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:26Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.469238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.469285 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.469327 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.469349 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.469362 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.491513 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.491563 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.491582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.491609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.491628 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: E1006 08:20:26.512637 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:26Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:26 crc kubenswrapper[4991]: E1006 08:20:26.512938 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.515138 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.515210 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.515228 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.515254 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.515275 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.618905 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.618992 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.619020 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.619056 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.619081 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.722660 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.722752 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.722777 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.722808 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.722831 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.825899 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.825942 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.825953 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.825974 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.825986 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.929432 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.929497 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.929521 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.929547 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:26 crc kubenswrapper[4991]: I1006 08:20:26.929565 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:26Z","lastTransitionTime":"2025-10-06T08:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.036391 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.036453 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.036465 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.036485 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.036498 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:27Z","lastTransitionTime":"2025-10-06T08:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.140041 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.140119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.140138 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.140430 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.140509 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:27Z","lastTransitionTime":"2025-10-06T08:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.243254 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.243260 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:27 crc kubenswrapper[4991]: E1006 08:20:27.243520 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:27 crc kubenswrapper[4991]: E1006 08:20:27.243699 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.243793 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:27 crc kubenswrapper[4991]: E1006 08:20:27.243929 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.243956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.244018 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.244039 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.244068 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.244086 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:27Z","lastTransitionTime":"2025-10-06T08:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.245345 4991 scope.go:117] "RemoveContainer" containerID="a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.347260 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.347346 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.347365 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.347390 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.347409 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:27Z","lastTransitionTime":"2025-10-06T08:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.451053 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.451093 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.451109 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.451134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.451153 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:27Z","lastTransitionTime":"2025-10-06T08:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.554703 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.554759 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.554778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.554810 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.554837 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:27Z","lastTransitionTime":"2025-10-06T08:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.658457 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.658502 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.658520 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.658547 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.658567 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:27Z","lastTransitionTime":"2025-10-06T08:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.762466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.762521 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.762538 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.762559 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.762575 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:27Z","lastTransitionTime":"2025-10-06T08:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.765275 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/2.log" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.767948 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.769458 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.798324 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c
761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.818883 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.835183 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.849700 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.865031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.865069 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.865080 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.865097 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.865109 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:27Z","lastTransitionTime":"2025-10-06T08:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.868506 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc 
kubenswrapper[4991]: I1006 08:20:27.882951 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b415111776
7f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 
08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.898078 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.913503 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.931074 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.944165 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.960327 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.967570 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.967608 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.967617 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.967633 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.967643 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:27Z","lastTransitionTime":"2025-10-06T08:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.979546 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:27 crc kubenswrapper[4991]: I1006 08:20:27.991476 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:27Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.007722 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:23Z\\\",\\\"message\\\":\\\"2025-10-06T08:19:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd\\\\n2025-10-06T08:19:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd to /host/opt/cni/bin/\\\\n2025-10-06T08:19:38Z [verbose] multus-daemon started\\\\n2025-10-06T08:19:38Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:20:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.030233 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.049903 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af
54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.064533 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.069577 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.069619 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.069630 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:28 crc 
kubenswrapper[4991]: I1006 08:20:28.069647 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.069913 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:28Z","lastTransitionTime":"2025-10-06T08:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.084213 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52
47f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.173026 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.173078 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.173118 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.173148 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.173161 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:28Z","lastTransitionTime":"2025-10-06T08:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.244137 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:28 crc kubenswrapper[4991]: E1006 08:20:28.244388 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.257983 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.276119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.276196 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.276214 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.276321 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.276342 4991 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:28Z","lastTransitionTime":"2025-10-06T08:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.379501 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.379554 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.379564 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.379585 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.379596 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:28Z","lastTransitionTime":"2025-10-06T08:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.483589 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.483656 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.483674 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.483699 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.483717 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:28Z","lastTransitionTime":"2025-10-06T08:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.587097 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.587179 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.587205 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.587240 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.587266 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:28Z","lastTransitionTime":"2025-10-06T08:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.691029 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.691101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.691118 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.691146 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.691163 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:28Z","lastTransitionTime":"2025-10-06T08:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.774081 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/3.log" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.775112 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/2.log" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.779254 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" exitCode=1 Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.780150 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c"} Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.780244 4991 scope.go:117] "RemoveContainer" containerID="a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.781293 4991 scope.go:117] "RemoveContainer" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:20:28 crc kubenswrapper[4991]: E1006 08:20:28.781601 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.799128 4991 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.799185 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.799209 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.799457 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.799481 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:28Z","lastTransitionTime":"2025-10-06T08:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.820817 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.844288 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.864814 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.884645 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.905379 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.905162 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:28Z\\\",\\\"message\\\":\\\"t lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 08:20:28.248472 7200 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-pgn9b\\\\nI1006 08:20:28.248822 7200 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-pgn9b\\\\nF1006 08:20:28.248824 7200 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.905416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.905573 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.905616 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.905642 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:28Z","lastTransitionTime":"2025-10-06T08:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.921474 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.936147 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.951207 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.968356 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.979684 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:28 crc kubenswrapper[4991]: I1006 08:20:28.995052 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:23Z\\\",\\\"message\\\":\\\"2025-10-06T08:19:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd\\\\n2025-10-06T08:19:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd to /host/opt/cni/bin/\\\\n2025-10-06T08:19:38Z [verbose] multus-daemon started\\\\n2025-10-06T08:19:38Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:20:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:28Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.006652 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.008239 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.008382 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.008455 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:29 crc 
kubenswrapper[4991]: I1006 08:20:29.008531 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.008594 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:29Z","lastTransitionTime":"2025-10-06T08:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.023334 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52
47f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.038518 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc 
kubenswrapper[4991]: I1006 08:20:29.054074 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e5c759-8037-476e-9cb0-d31a36cbbde6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fae28e1f9e34b6670b19842581b89981626f77f1e3cec07a7c9a4610557c86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.086049 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.100624 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.110672 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.110718 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.110726 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 
08:20:29.110745 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.110755 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:29Z","lastTransitionTime":"2025-10-06T08:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.117590 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.127587 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.213941 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.214002 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.214021 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.214049 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.214069 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:29Z","lastTransitionTime":"2025-10-06T08:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.243740 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.244285 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:29 crc kubenswrapper[4991]: E1006 08:20:29.244514 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.244759 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:29 crc kubenswrapper[4991]: E1006 08:20:29.244936 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:29 crc kubenswrapper[4991]: E1006 08:20:29.245352 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.260847 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc 
kubenswrapper[4991]: I1006 08:20:29.277577 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e5c759-8037-476e-9cb0-d31a36cbbde6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fae28e1f9e34b6670b19842581b89981626f77f1e3cec07a7c9a4610557c86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.317172 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.318395 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.318504 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.318530 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.318614 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.318686 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:29Z","lastTransitionTime":"2025-10-06T08:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.339547 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.353888 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.364524 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.379856 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T0
8:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.403601 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.422418 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.422682 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.422873 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.423291 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.423738 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:29Z","lastTransitionTime":"2025-10-06T08:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.424018 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.446500 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.481079 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b7b4cf0f7fd4ee56dd59c0cba40db2207b76ed889aea3226652092874b4d9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:19:59Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 08:19:59.150978 6849 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 08:19:59.150995 6849 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 08:19:59.151007 6849 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 08:19:59.151049 6849 handler.go:190] Sending 
*v1.Namespace event handler 1 for removal\\\\nI1006 08:19:59.151066 6849 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 08:19:59.151068 6849 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 08:19:59.151088 6849 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 08:19:59.151107 6849 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 08:19:59.151114 6849 factory.go:656] Stopping watch factory\\\\nI1006 08:19:59.151128 6849 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 08:19:59.151284 6849 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 08:19:59.151425 6849 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 08:19:59.151480 6849 ovnkube.go:599] Stopped ovnkube\\\\nI1006 08:19:59.151513 6849 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 08:19:59.151615 6849 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:28Z\\\",\\\"message\\\":\\\"t lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 08:20:28.248472 7200 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-pgn9b\\\\nI1006 08:20:28.248822 7200 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-pgn9b\\\\nF1006 08:20:28.248824 7200 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.501766 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af
54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.524919 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.527166 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.527231 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.527258 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.527334 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.527363 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:29Z","lastTransitionTime":"2025-10-06T08:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.546494 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.568244 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.584185 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.604717 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:23Z\\\",\\\"message\\\":\\\"2025-10-06T08:19:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd\\\\n2025-10-06T08:19:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd to /host/opt/cni/bin/\\\\n2025-10-06T08:19:38Z [verbose] multus-daemon started\\\\n2025-10-06T08:19:38Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:20:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.625526 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.631312 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.631492 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.631559 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:29 crc 
kubenswrapper[4991]: I1006 08:20:29.631660 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.631751 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:29Z","lastTransitionTime":"2025-10-06T08:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.659775 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52
47f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.734766 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.734828 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.734840 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.734862 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.734876 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:29Z","lastTransitionTime":"2025-10-06T08:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.787785 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/3.log" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.794657 4991 scope.go:117] "RemoveContainer" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:20:29 crc kubenswrapper[4991]: E1006 08:20:29.794977 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.814904 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.840292 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.840469 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.840495 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:29 crc 
kubenswrapper[4991]: I1006 08:20:29.840561 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.840583 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:29Z","lastTransitionTime":"2025-10-06T08:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.841984 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52
47f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.864284 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.881443 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.896483 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.914001 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc 
kubenswrapper[4991]: I1006 08:20:29.928634 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e5c759-8037-476e-9cb0-d31a36cbbde6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fae28e1f9e34b6670b19842581b89981626f77f1e3cec07a7c9a4610557c86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.943943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.944029 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.944047 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 
08:20:29.944077 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.944096 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:29Z","lastTransitionTime":"2025-10-06T08:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.964008 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:29 crc kubenswrapper[4991]: I1006 08:20:29.982933 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.000857 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:20:29Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.025264 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 
08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.046416 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.048798 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.048891 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.048916 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.048939 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.048955 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:30Z","lastTransitionTime":"2025-10-06T08:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.063209 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.080629 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.104146 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:23Z\\\",\\\"message\\\":\\\"2025-10-06T08:19:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd\\\\n2025-10-06T08:19:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd to /host/opt/cni/bin/\\\\n2025-10-06T08:19:38Z [verbose] multus-daemon started\\\\n2025-10-06T08:19:38Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:20:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.132342 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:28Z\\\",\\\"message\\\":\\\"t lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 08:20:28.248472 7200 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-pgn9b\\\\nI1006 08:20:28.248822 7200 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-pgn9b\\\\nF1006 08:20:28.248824 7200 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:20:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf1253495193
92e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.152647 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.152690 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.152702 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.152721 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.152735 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:30Z","lastTransitionTime":"2025-10-06T08:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.153375 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.172339 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.190323 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:30Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.242830 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:30 crc kubenswrapper[4991]: E1006 08:20:30.242988 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.256672 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.256750 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.256770 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.256801 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.256820 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:30Z","lastTransitionTime":"2025-10-06T08:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.359638 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.359707 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.359725 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.359754 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.359775 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:30Z","lastTransitionTime":"2025-10-06T08:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.462487 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.462553 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.462570 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.462596 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.462615 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:30Z","lastTransitionTime":"2025-10-06T08:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.566953 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.567020 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.567044 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.567072 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.567091 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:30Z","lastTransitionTime":"2025-10-06T08:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.670253 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.670335 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.670356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.670383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.670403 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:30Z","lastTransitionTime":"2025-10-06T08:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.774184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.774281 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.774334 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.774360 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.774380 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:30Z","lastTransitionTime":"2025-10-06T08:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.877503 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.877556 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.877574 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.877599 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.877613 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:30Z","lastTransitionTime":"2025-10-06T08:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.980771 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.980834 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.980843 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.980863 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:30 crc kubenswrapper[4991]: I1006 08:20:30.980873 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:30Z","lastTransitionTime":"2025-10-06T08:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.083859 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.083930 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.083952 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.084018 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.084044 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:31Z","lastTransitionTime":"2025-10-06T08:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.187748 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.187839 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.187863 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.187897 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.187922 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:31Z","lastTransitionTime":"2025-10-06T08:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.243141 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.243205 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.243221 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:31 crc kubenswrapper[4991]: E1006 08:20:31.243430 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:31 crc kubenswrapper[4991]: E1006 08:20:31.243621 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:31 crc kubenswrapper[4991]: E1006 08:20:31.243807 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.291609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.291700 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.291733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.291763 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.291786 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:31Z","lastTransitionTime":"2025-10-06T08:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.395487 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.395526 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.395534 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.395552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.395561 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:31Z","lastTransitionTime":"2025-10-06T08:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.499044 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.499131 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.499150 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.499182 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.499200 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:31Z","lastTransitionTime":"2025-10-06T08:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.601917 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.601981 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.602000 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.602025 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.602043 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:31Z","lastTransitionTime":"2025-10-06T08:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.705811 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.705894 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.705913 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.705943 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.705963 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:31Z","lastTransitionTime":"2025-10-06T08:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.809259 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.809350 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.809362 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.809386 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.809400 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:31Z","lastTransitionTime":"2025-10-06T08:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.913246 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.913408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.913435 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.913469 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:31 crc kubenswrapper[4991]: I1006 08:20:31.913491 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:31Z","lastTransitionTime":"2025-10-06T08:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.016898 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.016970 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.016985 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.017009 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.017025 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:32Z","lastTransitionTime":"2025-10-06T08:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.120533 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.120621 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.120650 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.120682 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.120706 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:32Z","lastTransitionTime":"2025-10-06T08:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.223932 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.224019 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.224046 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.224079 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.224099 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:32Z","lastTransitionTime":"2025-10-06T08:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.243030 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:32 crc kubenswrapper[4991]: E1006 08:20:32.243406 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.327679 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.327740 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.327758 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.327785 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.327819 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:32Z","lastTransitionTime":"2025-10-06T08:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.431209 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.431280 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.431339 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.431367 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.431386 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:32Z","lastTransitionTime":"2025-10-06T08:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.534810 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.534871 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.534888 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.534913 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.534930 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:32Z","lastTransitionTime":"2025-10-06T08:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.638035 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.638112 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.638130 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.638159 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.638178 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:32Z","lastTransitionTime":"2025-10-06T08:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.741197 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.741270 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.741287 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.741349 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.741369 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:32Z","lastTransitionTime":"2025-10-06T08:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.845387 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.845454 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.845472 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.845504 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.845522 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:32Z","lastTransitionTime":"2025-10-06T08:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.948639 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.948733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.948774 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.948816 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:32 crc kubenswrapper[4991]: I1006 08:20:32.948845 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:32Z","lastTransitionTime":"2025-10-06T08:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.052681 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.052730 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.052741 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.052757 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.052767 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:33Z","lastTransitionTime":"2025-10-06T08:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.153538 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.153620 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.153680 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:37.153646796 +0000 UTC m=+148.891396847 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.153734 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.153785 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.153801 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.153802 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.153814 4991 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.153849 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.153855 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 08:21:37.153845892 +0000 UTC m=+148.891595913 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.154007 4991 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.154061 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-06 08:21:37.154045817 +0000 UTC m=+148.891795878 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.154199 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.154250 4991 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.154273 4991 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.154265 4991 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.154405 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 08:21:37.154375598 +0000 UTC m=+148.892125799 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.154576 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 08:21:37.154542522 +0000 UTC m=+148.892292583 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.155411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.155452 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.155466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.155482 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.155494 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:33Z","lastTransitionTime":"2025-10-06T08:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.243773 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.243925 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.243964 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.243773 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.244191 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:33 crc kubenswrapper[4991]: E1006 08:20:33.244282 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.258092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.258167 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.258187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.258214 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.258236 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:33Z","lastTransitionTime":"2025-10-06T08:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.361786 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.361878 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.361914 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.361950 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.361977 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:33Z","lastTransitionTime":"2025-10-06T08:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.466590 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.466685 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.466717 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.466752 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.466784 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:33Z","lastTransitionTime":"2025-10-06T08:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.569720 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.569807 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.569821 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.569841 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.569856 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:33Z","lastTransitionTime":"2025-10-06T08:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.673003 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.673040 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.673052 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.673071 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.673084 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:33Z","lastTransitionTime":"2025-10-06T08:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.775889 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.775962 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.775972 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.775989 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.775999 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:33Z","lastTransitionTime":"2025-10-06T08:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.879161 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.879226 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.879241 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.879265 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.879278 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:33Z","lastTransitionTime":"2025-10-06T08:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.982788 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.982866 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.982884 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.982911 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:33 crc kubenswrapper[4991]: I1006 08:20:33.982930 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:33Z","lastTransitionTime":"2025-10-06T08:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.087099 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.087171 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.087184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.087203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.087214 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:34Z","lastTransitionTime":"2025-10-06T08:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.190707 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.190810 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.190838 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.190878 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.190911 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:34Z","lastTransitionTime":"2025-10-06T08:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.243746 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:34 crc kubenswrapper[4991]: E1006 08:20:34.243996 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.293995 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.294065 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.294085 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.294113 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.294131 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:34Z","lastTransitionTime":"2025-10-06T08:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.397716 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.397781 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.397800 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.397829 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.397848 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:34Z","lastTransitionTime":"2025-10-06T08:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.501259 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.501358 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.501377 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.501410 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.501430 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:34Z","lastTransitionTime":"2025-10-06T08:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.604596 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.604682 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.604713 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.604746 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.604769 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:34Z","lastTransitionTime":"2025-10-06T08:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.708116 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.708188 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.708225 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.708252 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.708271 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:34Z","lastTransitionTime":"2025-10-06T08:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.813878 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.813978 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.814001 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.814063 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.814082 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:34Z","lastTransitionTime":"2025-10-06T08:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.917885 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.917931 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.917949 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.917973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:34 crc kubenswrapper[4991]: I1006 08:20:34.917992 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:34Z","lastTransitionTime":"2025-10-06T08:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.021361 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.021428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.021446 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.021476 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.021496 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:35Z","lastTransitionTime":"2025-10-06T08:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.124800 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.124873 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.124891 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.124924 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.124944 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:35Z","lastTransitionTime":"2025-10-06T08:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.230254 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.230369 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.230392 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.230419 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.230439 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:35Z","lastTransitionTime":"2025-10-06T08:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.244573 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:35 crc kubenswrapper[4991]: E1006 08:20:35.244766 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.244998 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:35 crc kubenswrapper[4991]: E1006 08:20:35.245091 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.245339 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:35 crc kubenswrapper[4991]: E1006 08:20:35.245482 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.334495 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.334587 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.334614 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.334647 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.334669 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:35Z","lastTransitionTime":"2025-10-06T08:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.438170 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.438230 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.438248 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.438272 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.438289 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:35Z","lastTransitionTime":"2025-10-06T08:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.541657 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.541749 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.541771 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.541796 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.541815 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:35Z","lastTransitionTime":"2025-10-06T08:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.644625 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.644668 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.644702 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.644719 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.644757 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:35Z","lastTransitionTime":"2025-10-06T08:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.749175 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.749215 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.749226 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.749243 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.749259 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:35Z","lastTransitionTime":"2025-10-06T08:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.852815 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.852900 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.852919 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.852949 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.852969 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:35Z","lastTransitionTime":"2025-10-06T08:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.956674 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.956757 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.956773 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.956804 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:35 crc kubenswrapper[4991]: I1006 08:20:35.956823 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:35Z","lastTransitionTime":"2025-10-06T08:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.060394 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.060483 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.060506 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.060540 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.060567 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.163471 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.163526 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.163552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.163578 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.163595 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.242965 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:36 crc kubenswrapper[4991]: E1006 08:20:36.243172 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.265863 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.265918 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.265936 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.265960 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.265979 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.368637 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.368694 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.368709 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.368729 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.368743 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.471873 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.472174 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.472265 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.472411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.472503 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.576196 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.576266 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.576284 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.576345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.576364 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.606773 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.606851 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.606881 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.606915 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.606938 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: E1006 08:20:36.629346 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.635229 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.635283 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.635340 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.635371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.635393 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: E1006 08:20:36.656202 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list identical to the previous status patch at 08:20:36.635, elided... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.661552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.661625 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.661651 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.661682 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.661711 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: E1006 08:20:36.681472 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list identical to the previous status patch, elided... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.686479 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.686524 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.686542 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.686566 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.686583 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: E1006 08:20:36.707359 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.712893 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.712934 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.712948 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.712966 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.712979 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: E1006 08:20:36.732482 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:36Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:36 crc kubenswrapper[4991]: E1006 08:20:36.732740 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.735242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.735330 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.735357 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.735383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.735400 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.838225 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.838318 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.838337 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.838368 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.838387 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.942658 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.942735 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.942758 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.942789 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:36 crc kubenswrapper[4991]: I1006 08:20:36.942814 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:36Z","lastTransitionTime":"2025-10-06T08:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.047801 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.048123 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.048337 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.048430 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.048514 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:37Z","lastTransitionTime":"2025-10-06T08:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.151405 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.151726 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.151862 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.152004 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.152092 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:37Z","lastTransitionTime":"2025-10-06T08:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.242871 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.242945 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.242991 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:37 crc kubenswrapper[4991]: E1006 08:20:37.243160 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:37 crc kubenswrapper[4991]: E1006 08:20:37.243366 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:37 crc kubenswrapper[4991]: E1006 08:20:37.243524 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.253916 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.254081 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.254167 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.254236 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.254311 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:37Z","lastTransitionTime":"2025-10-06T08:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.358285 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.358420 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.358440 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.358469 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.358490 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:37Z","lastTransitionTime":"2025-10-06T08:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.461409 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.461448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.461458 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.461494 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.461503 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:37Z","lastTransitionTime":"2025-10-06T08:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.565162 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.565234 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.565257 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.565285 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.565370 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:37Z","lastTransitionTime":"2025-10-06T08:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.668582 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.668657 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.668678 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.668708 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.668728 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:37Z","lastTransitionTime":"2025-10-06T08:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.772338 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.772414 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.772433 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.772463 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.772482 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:37Z","lastTransitionTime":"2025-10-06T08:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.876147 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.877006 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.877093 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.877223 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.877349 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:37Z","lastTransitionTime":"2025-10-06T08:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.981805 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.981894 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.981917 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.981953 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:37 crc kubenswrapper[4991]: I1006 08:20:37.981980 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:37Z","lastTransitionTime":"2025-10-06T08:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.085608 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.085680 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.085703 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.085733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.085754 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:38Z","lastTransitionTime":"2025-10-06T08:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.189032 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.189089 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.189101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.189119 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.189134 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:38Z","lastTransitionTime":"2025-10-06T08:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.243075 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:38 crc kubenswrapper[4991]: E1006 08:20:38.243252 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.292702 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.292777 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.292795 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.292823 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.292841 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:38Z","lastTransitionTime":"2025-10-06T08:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.395794 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.396238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.396428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.396576 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.396693 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:38Z","lastTransitionTime":"2025-10-06T08:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.500028 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.500106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.500131 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.500166 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.500193 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:38Z","lastTransitionTime":"2025-10-06T08:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.603598 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.604076 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.604354 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.604554 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.604758 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:38Z","lastTransitionTime":"2025-10-06T08:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.707820 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.708245 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.708570 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.708774 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.708986 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:38Z","lastTransitionTime":"2025-10-06T08:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.812372 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.812434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.812448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.812463 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.812472 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:38Z","lastTransitionTime":"2025-10-06T08:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.915136 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.915238 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.915255 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.915280 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:38 crc kubenswrapper[4991]: I1006 08:20:38.915336 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:38Z","lastTransitionTime":"2025-10-06T08:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.018957 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.019038 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.019050 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.019072 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.019086 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:39Z","lastTransitionTime":"2025-10-06T08:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.121956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.122021 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.122033 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.122055 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.122068 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:39Z","lastTransitionTime":"2025-10-06T08:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.225076 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.225154 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.225175 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.225208 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.225230 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:39Z","lastTransitionTime":"2025-10-06T08:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.243846 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.243950 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:39 crc kubenswrapper[4991]: E1006 08:20:39.244063 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.244116 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:39 crc kubenswrapper[4991]: E1006 08:20:39.244278 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:39 crc kubenswrapper[4991]: E1006 08:20:39.244372 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.260918 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.281499 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\
\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initC
ontainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.292952 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.308872 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.322574 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.328587 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.328626 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.328641 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.328661 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.328706 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:39Z","lastTransitionTime":"2025-10-06T08:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.342600 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:23Z\\\",\\\"message\\\":\\\"2025-10-06T08:19:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd\\\\n2025-10-06T08:19:37+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd to /host/opt/cni/bin/\\\\n2025-10-06T08:19:38Z [verbose] multus-daemon started\\\\n2025-10-06T08:19:38Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:20:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.364645 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:28Z\\\",\\\"message\\\":\\\"t lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 08:20:28.248472 7200 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-pgn9b\\\\nI1006 08:20:28.248822 7200 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-pgn9b\\\\nF1006 08:20:28.248824 7200 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:20:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf1253495193
92e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.379363 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af
54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.393723 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.406410 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.421745 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.433503 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.434151 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.434184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.434254 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.434287 4991 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:39Z","lastTransitionTime":"2025-10-06T08:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.443359 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.461222 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.477596 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.494433 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.509016 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc 
kubenswrapper[4991]: I1006 08:20:39.524792 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e5c759-8037-476e-9cb0-d31a36cbbde6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fae28e1f9e34b6670b19842581b89981626f77f1e3cec07a7c9a4610557c86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.537865 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.537927 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.537947 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 
08:20:39.537973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.537992 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:39Z","lastTransitionTime":"2025-10-06T08:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.562408 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.581722 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:39Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.640781 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.640896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.640919 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.640947 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.640967 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:39Z","lastTransitionTime":"2025-10-06T08:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.745280 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.745376 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.745397 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.745426 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.745447 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:39Z","lastTransitionTime":"2025-10-06T08:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.848814 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.848907 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.848932 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.848956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.848974 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:39Z","lastTransitionTime":"2025-10-06T08:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.952114 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.952163 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.952181 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.952203 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:39 crc kubenswrapper[4991]: I1006 08:20:39.952220 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:39Z","lastTransitionTime":"2025-10-06T08:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.055636 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.055752 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.055773 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.055842 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.055863 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:40Z","lastTransitionTime":"2025-10-06T08:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.159555 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.159648 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.159742 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.159797 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.159825 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:40Z","lastTransitionTime":"2025-10-06T08:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.243010 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:40 crc kubenswrapper[4991]: E1006 08:20:40.243273 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.263995 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.264118 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.264139 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.264164 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.264182 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:40Z","lastTransitionTime":"2025-10-06T08:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.367247 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.367338 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.367356 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.367386 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.367405 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:40Z","lastTransitionTime":"2025-10-06T08:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.470420 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.470499 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.470517 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.470544 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.470565 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:40Z","lastTransitionTime":"2025-10-06T08:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.573487 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.573550 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.573567 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.573595 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.573613 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:40Z","lastTransitionTime":"2025-10-06T08:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.676218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.676265 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.676280 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.676331 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.676351 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:40Z","lastTransitionTime":"2025-10-06T08:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.779241 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.779353 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.779371 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.779397 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.779416 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:40Z","lastTransitionTime":"2025-10-06T08:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.882683 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.882759 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.882778 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.882806 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.882827 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:40Z","lastTransitionTime":"2025-10-06T08:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.986679 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.986763 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.986789 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.986823 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:40 crc kubenswrapper[4991]: I1006 08:20:40.986845 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:40Z","lastTransitionTime":"2025-10-06T08:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.089828 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.089880 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.089898 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.089919 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.089936 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:41Z","lastTransitionTime":"2025-10-06T08:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.192132 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.192202 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.192219 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.192245 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.192264 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:41Z","lastTransitionTime":"2025-10-06T08:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.243653 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.243691 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.243810 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:41 crc kubenswrapper[4991]: E1006 08:20:41.244548 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:41 crc kubenswrapper[4991]: E1006 08:20:41.244662 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:41 crc kubenswrapper[4991]: E1006 08:20:41.244768 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.244963 4991 scope.go:117] "RemoveContainer" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:20:41 crc kubenswrapper[4991]: E1006 08:20:41.245264 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.295470 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.295526 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.295543 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.295563 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.295580 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:41Z","lastTransitionTime":"2025-10-06T08:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.398774 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.398864 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.398883 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.398901 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.398916 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:41Z","lastTransitionTime":"2025-10-06T08:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.502364 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.502419 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.502439 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.502462 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.502478 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:41Z","lastTransitionTime":"2025-10-06T08:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.606383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.606502 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.606524 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.606546 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.606562 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:41Z","lastTransitionTime":"2025-10-06T08:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.709182 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.709241 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.709254 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.709274 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.709287 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:41Z","lastTransitionTime":"2025-10-06T08:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.812665 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.812733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.812752 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.812782 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.812799 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:41Z","lastTransitionTime":"2025-10-06T08:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.916433 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.916488 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.916505 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.916531 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:41 crc kubenswrapper[4991]: I1006 08:20:41.916550 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:41Z","lastTransitionTime":"2025-10-06T08:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.019974 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.020038 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.020055 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.020080 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.020098 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:42Z","lastTransitionTime":"2025-10-06T08:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.123341 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.123426 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.123448 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.123479 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.123501 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:42Z","lastTransitionTime":"2025-10-06T08:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.226500 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.226579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.226601 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.226628 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.226648 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:42Z","lastTransitionTime":"2025-10-06T08:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.243147 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:42 crc kubenswrapper[4991]: E1006 08:20:42.243404 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.329977 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.330065 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.330084 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.330114 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.330144 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:42Z","lastTransitionTime":"2025-10-06T08:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.432637 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.432711 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.432733 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.432767 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.432791 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:42Z","lastTransitionTime":"2025-10-06T08:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.535000 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.535054 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.535063 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.535079 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.535090 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:42Z","lastTransitionTime":"2025-10-06T08:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.638645 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.638727 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.638763 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.638785 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.638800 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:42Z","lastTransitionTime":"2025-10-06T08:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.742107 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.742181 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.742192 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.742209 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.742222 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:42Z","lastTransitionTime":"2025-10-06T08:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.844648 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.844718 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.844731 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.844750 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.844761 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:42Z","lastTransitionTime":"2025-10-06T08:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.948345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.948417 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.948427 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.948446 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:42 crc kubenswrapper[4991]: I1006 08:20:42.948478 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:42Z","lastTransitionTime":"2025-10-06T08:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.058045 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.058123 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.058157 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.058190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.058211 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:43Z","lastTransitionTime":"2025-10-06T08:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.162355 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.162402 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.162411 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.162432 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.162443 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:43Z","lastTransitionTime":"2025-10-06T08:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.243600 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.243644 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.243644 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:43 crc kubenswrapper[4991]: E1006 08:20:43.243812 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:43 crc kubenswrapper[4991]: E1006 08:20:43.244179 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:43 crc kubenswrapper[4991]: E1006 08:20:43.244309 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.265180 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.265251 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.265283 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.265348 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.265373 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:43Z","lastTransitionTime":"2025-10-06T08:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.369335 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.369546 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.369624 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.369656 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.369712 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:43Z","lastTransitionTime":"2025-10-06T08:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.473904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.474082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.474106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.474168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.474187 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:43Z","lastTransitionTime":"2025-10-06T08:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.577796 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.577876 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.577896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.577924 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.577943 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:43Z","lastTransitionTime":"2025-10-06T08:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.681554 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.681625 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.681644 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.681688 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.681708 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:43Z","lastTransitionTime":"2025-10-06T08:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.784840 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.784933 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.784954 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.785013 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.785033 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:43Z","lastTransitionTime":"2025-10-06T08:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.888700 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.888766 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.888783 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.888806 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.888823 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:43Z","lastTransitionTime":"2025-10-06T08:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.992507 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.992595 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.992620 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.992655 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:43 crc kubenswrapper[4991]: I1006 08:20:43.992682 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:43Z","lastTransitionTime":"2025-10-06T08:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.095615 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.095669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.095683 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.095698 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.095709 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:44Z","lastTransitionTime":"2025-10-06T08:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.200169 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.200230 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.200247 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.200271 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.200290 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:44Z","lastTransitionTime":"2025-10-06T08:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.243771 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:44 crc kubenswrapper[4991]: E1006 08:20:44.243967 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.304033 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.304092 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.304109 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.304134 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.304158 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:44Z","lastTransitionTime":"2025-10-06T08:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.407891 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.407949 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.407967 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.407994 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.408013 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:44Z","lastTransitionTime":"2025-10-06T08:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.510832 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.510897 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.510914 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.510940 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.510958 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:44Z","lastTransitionTime":"2025-10-06T08:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.613611 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.614009 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.614155 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.614608 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.615254 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:44Z","lastTransitionTime":"2025-10-06T08:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.718876 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.718971 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.718989 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.719012 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.719028 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:44Z","lastTransitionTime":"2025-10-06T08:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.823144 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.823218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.823237 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.823273 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.823327 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:44Z","lastTransitionTime":"2025-10-06T08:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.926575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.926628 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.926641 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.926661 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:44 crc kubenswrapper[4991]: I1006 08:20:44.926676 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:44Z","lastTransitionTime":"2025-10-06T08:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.029805 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.029858 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.029871 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.029890 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.029904 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:45Z","lastTransitionTime":"2025-10-06T08:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.133682 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.133774 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.133806 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.133833 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.133853 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:45Z","lastTransitionTime":"2025-10-06T08:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.237213 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.237289 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.237345 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.237378 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.237397 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:45Z","lastTransitionTime":"2025-10-06T08:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.243601 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.243743 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:45 crc kubenswrapper[4991]: E1006 08:20:45.243843 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.243891 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:45 crc kubenswrapper[4991]: E1006 08:20:45.244138 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:45 crc kubenswrapper[4991]: E1006 08:20:45.244398 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.341465 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.341550 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.341575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.341609 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.341634 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:45Z","lastTransitionTime":"2025-10-06T08:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.445168 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.445263 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.445289 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.445362 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.445385 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:45Z","lastTransitionTime":"2025-10-06T08:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.556900 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.557009 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.557032 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.557061 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.557083 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:45Z","lastTransitionTime":"2025-10-06T08:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.662859 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.662938 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.662958 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.662992 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.663012 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:45Z","lastTransitionTime":"2025-10-06T08:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.765123 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.765592 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.765652 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.765688 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.765711 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:45Z","lastTransitionTime":"2025-10-06T08:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.868821 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.868877 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.868891 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.868909 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.868925 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:45Z","lastTransitionTime":"2025-10-06T08:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.971714 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.971771 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.971787 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.971810 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:45 crc kubenswrapper[4991]: I1006 08:20:45.971827 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:45Z","lastTransitionTime":"2025-10-06T08:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.074377 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.074545 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.074571 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.074597 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.074614 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.177640 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.178605 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.178814 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.178977 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.179120 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.242890 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:46 crc kubenswrapper[4991]: E1006 08:20:46.243472 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.283113 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.283191 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.283214 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.283245 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.283265 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.386723 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.386798 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.386823 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.386857 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.386881 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.491105 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.491181 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.491194 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.491216 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.491229 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.595003 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.595417 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.595528 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.595641 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.595741 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.699041 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.699575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.699741 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.699906 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.700042 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.803601 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.803929 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.804063 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.804190 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.804285 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.908625 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.908669 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.908679 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.908703 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.908716 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.937145 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.937529 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.937742 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.937959 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.938167 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: E1006 08:20:46.960152 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.964557 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.964805 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.964991 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.965201 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.965460 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:46 crc kubenswrapper[4991]: E1006 08:20:46.987222 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:46Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.992929 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.993001 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.993023 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.993056 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:46 crc kubenswrapper[4991]: I1006 08:20:46.993076 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:46Z","lastTransitionTime":"2025-10-06T08:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: E1006 08:20:47.014869 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.021073 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.021330 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.021547 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.021747 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.021952 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: E1006 08:20:47.044497 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.049730 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.049786 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.049803 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.049829 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.049846 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: E1006 08:20:47.066623 4991 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fdc65aba-65bf-4101-b45c-7ba497b89a18\\\",\\\"systemUUID\\\":\\\"a9848c46-d1c6-4335-aa9d-2c0df75a6fc7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:47Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:47 crc kubenswrapper[4991]: E1006 08:20:47.066861 4991 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.069579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.069629 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.069646 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.069673 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.069691 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.173213 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.173288 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.173349 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.173384 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.173411 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.243419 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:47 crc kubenswrapper[4991]: E1006 08:20:47.243611 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.243421 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:47 crc kubenswrapper[4991]: E1006 08:20:47.243718 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.243412 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:47 crc kubenswrapper[4991]: E1006 08:20:47.243793 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.281088 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.281432 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.281459 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.281496 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.281523 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.384911 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.384979 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.384998 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.385024 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.385042 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.487464 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.487510 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.487521 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.487539 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.487552 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.591054 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.591121 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.591140 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.591171 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.591214 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.694589 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.694695 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.694728 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.694751 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.694765 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.798475 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.798554 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.798574 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.798604 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.798634 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.902414 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.902894 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.903039 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.903173 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:47 crc kubenswrapper[4991]: I1006 08:20:47.903335 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:47Z","lastTransitionTime":"2025-10-06T08:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.006487 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.006874 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.007015 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.007150 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.007273 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:48Z","lastTransitionTime":"2025-10-06T08:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.110854 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.110937 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.110954 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.110984 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.111002 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:48Z","lastTransitionTime":"2025-10-06T08:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.214659 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.215091 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.215183 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.215278 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.215509 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:48Z","lastTransitionTime":"2025-10-06T08:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.243540 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:48 crc kubenswrapper[4991]: E1006 08:20:48.243716 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.364706 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.364756 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.364767 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.364788 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.364802 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:48Z","lastTransitionTime":"2025-10-06T08:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.467835 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.467876 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.467884 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.467901 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.467912 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:48Z","lastTransitionTime":"2025-10-06T08:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.570944 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.571003 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.571021 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.571047 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.571067 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:48Z","lastTransitionTime":"2025-10-06T08:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.674048 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.674120 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.674145 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.674179 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.674205 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:48Z","lastTransitionTime":"2025-10-06T08:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.776675 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.777059 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.777162 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.777260 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.777364 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:48Z","lastTransitionTime":"2025-10-06T08:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.880395 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.880450 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.880469 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.880494 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.880514 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:48Z","lastTransitionTime":"2025-10-06T08:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.983504 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.983896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.984039 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.984187 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:48 crc kubenswrapper[4991]: I1006 08:20:48.984356 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:48Z","lastTransitionTime":"2025-10-06T08:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.088509 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.088897 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.088979 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.089064 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.089144 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:49Z","lastTransitionTime":"2025-10-06T08:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.192021 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.192052 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.192060 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.192096 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.192107 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:49Z","lastTransitionTime":"2025-10-06T08:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.243614 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:49 crc kubenswrapper[4991]: E1006 08:20:49.243819 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.244669 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.244763 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:49 crc kubenswrapper[4991]: E1006 08:20:49.244890 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:49 crc kubenswrapper[4991]: E1006 08:20:49.245146 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.269736 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:28Z\\\",\\\"message\\\":\\\"t lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1006 08:20:28.248472 7200 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-pgn9b\\\\nI1006 08:20:28.248822 7200 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-pgn9b\\\\nF1006 08:20:28.248824 7200 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:20:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://451ddbcf1253495193
92e8be87f8781ff33738a203711f65689684275b1d6f83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cmj9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qwljw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.287831 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"775167a6-c1d2-4436-867f-3cf3e9dedd3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827940c491a640839be62d0dd5e833c73c335fbbf1dc250903f64830f4b9a281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e892a4ab7c2c27cdd7cd3610ab26bc56b0af
54ab2652104f5918693f12bc12c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7lwjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t6c85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.294552 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.294623 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.294641 4991 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.294667 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.294686 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:49Z","lastTransitionTime":"2025-10-06T08:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.303649 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f077046-3398-4e00-8196-77a35a5dae86\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93a94513b0d48252d29e34f0894ed101f839dd951aafd28f6b559c0a736fe3aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd6940e91f0573fad020fa28941b5771fc504467b3a2c097c6b72f3fb9e5fe0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://035ebcb00b02d309aa779efc94714f08b124f2608716acbf417ace0c44568c96\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://247ab62d98bacd2b30cf2148dcb02a161defff0489d02b1069e83546de86f93f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.319120 4991 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52809b1f-2590-49ae-a8ee-62cc57f7924b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6f694b7317f0bb32d50eac30a90b58d3aa18e64c27c1705020b95a030cf26b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8adc03dd71b6bcfde3035b71d26a2883f63f4f2eb70a404c5fd27a9d420fb3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://070751bc39916f755da7d98fbc4572031af30e1d45f70bbcafbd24bc6e90a204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c2295f02c0d79ecef0cbb4a8bbb3b74aa745437914461a65aadc05ae35a4b00\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.336047 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55d2305a75ce4624eac7eb0b1fb6cf4172c90faeba7e5b78ee9f05ab465686a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d482b9f31e18bfe002085c5f30c6a672d7bce44622c318257f0143f31525d4c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.352680 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scqml" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c92a7298-0ed4-4956-98d8-8eb78df3f1e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://546d2288f468567b68158ba1b8a7c7287b0db8eb1bf52a38493b55903d91f94a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4kzd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:33Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scqml\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.369901 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjvmw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58386a1a-6047-42ce-a952-43f397822919\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T08:20:23Z\\\",\\\"message\\\":\\\"2025-10-06T08:19:37+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd\\\\n2025-10-06T08:19:37+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d8fd4b00-cb8b-4c5e-a732-860ebfa34ffd to /host/opt/cni/bin/\\\\n2025-10-06T08:19:38Z [verbose] multus-daemon started\\\\n2025-10-06T08:19:38Z [verbose] Readiness Indicator file check\\\\n2025-10-06T08:20:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xzc78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjvmw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.383952 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65471d7d-65b6-49ce-90be-171db9b3cb42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fc63ce4566a60a046660ba2cd36341359322d391761ce390658bfdbb24c1a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e44
64ca386d04e12507a940664c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g7p92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wpb6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.397774 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.397828 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.397843 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:49 crc 
kubenswrapper[4991]: I1006 08:20:49.397861 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.397874 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:49Z","lastTransitionTime":"2025-10-06T08:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.403901 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"881045ce-f2cf-41d3-a315-eec70d0ed97d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8ea734f4444cdd1897b10dffc7a0d18c5d3c66d5ffb7c654ff315dde10e0202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d6effde441b3bf7083fdee89bc3ec6f8c131dc5468a5e149cb2d1874efa8ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa90d0f85b5d0c00230be689998d0d3e2bbc0cd86154a6107c1758fb36c1aae9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34ddd6a5d7d55821c945c133ed6260744ec4ff207e5f70ea604668e14dc08f22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca69816d0c10b4137e5970491ce9c576733f9047538750ebe381d1877ba44d81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac2a3363d542c1127db434e96bbabfca1dc63898177e3cfb9f8b76124fc89705\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5247f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52
47f7866b1db1027c0af6740d6a37df46ff96d4eea02a9d0bfce6bf6e8c4f41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bc2xh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:34Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pgn9b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.416692 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-787zw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e38e446-d0d7-463a-987a-110a8e95fe84\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dggwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:48Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-787zw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc 
kubenswrapper[4991]: I1006 08:20:49.427550 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e5c759-8037-476e-9cb0-d31a36cbbde6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fae28e1f9e34b6670b19842581b89981626f77f1e3cec07a7c9a4610557c86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d125182810217335e9e760bad80f33e4018c631aaf4dfc1374950a888102ca6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.447601 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4f066b5-4bd5-492e-acef-c6bf1fa17e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://446e2000e25f980c1f6a46fa65559f496a20f04cf1d589fe9ea0c1a9adf7f383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53bbb3bde179c7125181926f11b6d50f09f516e66a619f84fe2372c30f0a1ef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf4570a167a135c47f723dd0173e91097dac4efff5278e9c427d1079370aea3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d52db8e7f85235be39aaad0dfa8b9d901a431267926e7fac80c96b451f2ca75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43a4d943eb3eb6234e54c9fbfd2ab9540254a676de8feae8f8c922bcce2d10b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://354e12a52556c30efdd621f1eda21ef3a6850c1dc1ee9b257934c5e3a0016fde\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c4a7bf2406259240ec62b0e383ef7937b0cc5922e6d414a9522ee46650a5adf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c761c23a791dc4b3ddbe83b4fbfae9805c985ce8c53ee04d4735240e0a15f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.462480 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.475705 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.487617 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bjjz6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"270ca557-afe0-4918-b9b9-0beae133a293\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62614a7da276b894a1ccab45b4f60e7ad28ccb3a928ca417764e28bc9436a160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g4m5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:36Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bjjz6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.501359 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.501416 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.501430 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.501450 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.501485 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:49Z","lastTransitionTime":"2025-10-06T08:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.502557 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d10d99-6365-4aaf-9a31-40b0379f039d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:20:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://391716f9baee9ce46ad9e2c3246cc77c1e0b54807ba167e965b731f923324257\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96b0b4151117767f443261ca938df05842f35133ed3d9aa5786b9eca4b05b5b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b1de253c64368fc2dc89b876fb8c5c0c24f6f3ef92e1def215ce81391c84c08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcc070c8df1b2b6b99931484f525e26101e7c2e6a23544db365ab6fc066f3ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9569e34394c62ea1c975a683d95d40051c756b8650514d0f67392fc117ddf7f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T08:19:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1006 08:19:28.597131 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 08:19:28.597274 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 08:19:28.598316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3487499230/tls.crt::/tmp/serving-cert-3487499230/tls.key\\\\\\\"\\\\nI1006 08:19:28.889281 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 08:19:28.891849 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 08:19:28.891868 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 08:19:28.891894 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 08:19:28.891899 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 08:19:28.899749 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 08:19:28.899783 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899790 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 08:19:28.899798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 08:19:28.899803 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1006 08:19:28.899807 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 08:19:28.899813 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1006 08:19:28.899803 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1006 08:19:28.902850 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5ae180849801cf19716ff4fe2e2714f2ed4ad300634a81b2fb8a517dbe9af7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a418fe160e11a1ec92233f74a971375e290c2ae720f2be15e0e2b13345bc628e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T08:19:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T08:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T08:19:09Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.519705 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fe2d9245c9842e2ece20e30b34266332d2784ffe34097efea9c0788db3b0d6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.533790 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.550002 4991 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T08:19:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb93ca5cb502c46414ea24dd798c1f74d3bd2dffd5b3b0e584a91df04bc3f07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T08:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T08:20:49Z is after 2025-08-24T17:21:41Z" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.605159 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.605229 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.605247 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.605276 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.605330 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:49Z","lastTransitionTime":"2025-10-06T08:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.708648 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.708703 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.708715 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.708734 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.708747 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:49Z","lastTransitionTime":"2025-10-06T08:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.812053 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.812104 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.812121 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.812146 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.812163 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:49Z","lastTransitionTime":"2025-10-06T08:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.915722 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.915773 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.915792 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.915820 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:49 crc kubenswrapper[4991]: I1006 08:20:49.915838 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:49Z","lastTransitionTime":"2025-10-06T08:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.018911 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.018963 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.018980 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.019000 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.019015 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:50Z","lastTransitionTime":"2025-10-06T08:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.122559 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.122951 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.123163 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.123408 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.123687 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:50Z","lastTransitionTime":"2025-10-06T08:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.227406 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.227458 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.227466 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.227483 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.227497 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:50Z","lastTransitionTime":"2025-10-06T08:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.243634 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:50 crc kubenswrapper[4991]: E1006 08:20:50.243888 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.330936 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.331162 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.331192 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.331224 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.331246 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:50Z","lastTransitionTime":"2025-10-06T08:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.435167 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.435229 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.435242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.435265 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.435279 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:50Z","lastTransitionTime":"2025-10-06T08:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.538816 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.538903 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.538923 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.538956 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.538975 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:50Z","lastTransitionTime":"2025-10-06T08:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.641829 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.641923 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.641947 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.641983 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.642001 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:50Z","lastTransitionTime":"2025-10-06T08:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.745457 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.745603 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.745627 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.745656 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.745676 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:50Z","lastTransitionTime":"2025-10-06T08:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.848959 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.849047 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.849064 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.849093 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.849112 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:50Z","lastTransitionTime":"2025-10-06T08:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.952761 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.952855 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.952877 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.952904 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:50 crc kubenswrapper[4991]: I1006 08:20:50.952923 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:50Z","lastTransitionTime":"2025-10-06T08:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.056975 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.057045 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.057062 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.057089 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.057107 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:51Z","lastTransitionTime":"2025-10-06T08:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.160743 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.160813 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.160831 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.160859 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.160877 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:51Z","lastTransitionTime":"2025-10-06T08:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.244582 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.244624 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:51 crc kubenswrapper[4991]: E1006 08:20:51.244844 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:51 crc kubenswrapper[4991]: E1006 08:20:51.245017 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.245379 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:51 crc kubenswrapper[4991]: E1006 08:20:51.245509 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.263531 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.263585 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.263600 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.263624 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.263643 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:51Z","lastTransitionTime":"2025-10-06T08:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.366342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.366405 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.366424 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.366453 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.366468 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:51Z","lastTransitionTime":"2025-10-06T08:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.469488 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.469533 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.469542 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.469558 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.469568 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:51Z","lastTransitionTime":"2025-10-06T08:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.572263 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.572392 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.572421 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.572455 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.572478 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:51Z","lastTransitionTime":"2025-10-06T08:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.675612 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.675675 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.675696 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.675725 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.675746 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:51Z","lastTransitionTime":"2025-10-06T08:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.779091 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.779162 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.779185 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.779216 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.779243 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:51Z","lastTransitionTime":"2025-10-06T08:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.881654 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.881740 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.881766 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.881800 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.881825 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:51Z","lastTransitionTime":"2025-10-06T08:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.985853 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.985948 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.985973 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.986008 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:51 crc kubenswrapper[4991]: I1006 08:20:51.986032 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:51Z","lastTransitionTime":"2025-10-06T08:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.089288 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.089379 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.089403 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.089434 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.089451 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:52Z","lastTransitionTime":"2025-10-06T08:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.192786 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.192836 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.192847 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.192864 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.192875 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:52Z","lastTransitionTime":"2025-10-06T08:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.243340 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:52 crc kubenswrapper[4991]: E1006 08:20:52.243680 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.296672 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.296734 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.296748 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.296768 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.296787 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:52Z","lastTransitionTime":"2025-10-06T08:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.399226 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.399342 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.399364 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.399393 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.399412 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:52Z","lastTransitionTime":"2025-10-06T08:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.502831 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.502913 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.502931 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.502964 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.502984 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:52Z","lastTransitionTime":"2025-10-06T08:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.511233 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:52 crc kubenswrapper[4991]: E1006 08:20:52.511451 4991 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:20:52 crc kubenswrapper[4991]: E1006 08:20:52.511573 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs podName:3e38e446-d0d7-463a-987a-110a8e95fe84 nodeName:}" failed. No retries permitted until 2025-10-06 08:21:56.51153597 +0000 UTC m=+168.249286031 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs") pod "network-metrics-daemon-787zw" (UID: "3e38e446-d0d7-463a-987a-110a8e95fe84") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.606659 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.606737 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.606753 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.606781 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.606800 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:52Z","lastTransitionTime":"2025-10-06T08:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.710675 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.710776 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.710812 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.710845 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.710863 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:52Z","lastTransitionTime":"2025-10-06T08:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.815738 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.815810 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.815836 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.815868 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.815891 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:52Z","lastTransitionTime":"2025-10-06T08:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.919714 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.919789 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.919809 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.919837 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:52 crc kubenswrapper[4991]: I1006 08:20:52.919858 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:52Z","lastTransitionTime":"2025-10-06T08:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.023121 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.023174 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.023186 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.023204 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.023218 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:53Z","lastTransitionTime":"2025-10-06T08:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.127386 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.127453 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.127470 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.127499 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.127519 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:53Z","lastTransitionTime":"2025-10-06T08:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.231247 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.231353 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.231366 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.231388 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.231402 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:53Z","lastTransitionTime":"2025-10-06T08:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.243081 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.243166 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:53 crc kubenswrapper[4991]: E1006 08:20:53.243363 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.243465 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:53 crc kubenswrapper[4991]: E1006 08:20:53.243632 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:53 crc kubenswrapper[4991]: E1006 08:20:53.243867 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.333939 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.334112 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.334137 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.334170 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.334195 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:53Z","lastTransitionTime":"2025-10-06T08:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.437790 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.437868 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.437888 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.437916 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.437934 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:53Z","lastTransitionTime":"2025-10-06T08:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.541046 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.541103 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.541122 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.541149 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.541169 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:53Z","lastTransitionTime":"2025-10-06T08:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.643683 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.643742 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.643764 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.643794 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.643819 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:53Z","lastTransitionTime":"2025-10-06T08:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.747041 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.747104 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.747121 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.747152 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.747170 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:53Z","lastTransitionTime":"2025-10-06T08:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.850955 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.851031 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.851050 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.851077 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.851098 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:53Z","lastTransitionTime":"2025-10-06T08:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.953886 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.953961 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.953986 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.954021 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:53 crc kubenswrapper[4991]: I1006 08:20:53.954046 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:53Z","lastTransitionTime":"2025-10-06T08:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.057666 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.057769 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.057792 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.057819 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.057839 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:54Z","lastTransitionTime":"2025-10-06T08:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.161994 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.162046 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.162056 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.162075 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.162086 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:54Z","lastTransitionTime":"2025-10-06T08:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.243648 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:54 crc kubenswrapper[4991]: E1006 08:20:54.243854 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.265779 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.265871 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.265890 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.265916 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.265933 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:54Z","lastTransitionTime":"2025-10-06T08:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.369015 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.369068 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.369081 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.369101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.369113 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:54Z","lastTransitionTime":"2025-10-06T08:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.472623 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.472693 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.472711 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.472738 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.472758 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:54Z","lastTransitionTime":"2025-10-06T08:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.576740 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.576821 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.576838 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.576866 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.576886 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:54Z","lastTransitionTime":"2025-10-06T08:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.680232 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.680338 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.680363 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.680392 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.680409 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:54Z","lastTransitionTime":"2025-10-06T08:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.784278 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.784447 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.784472 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.784542 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.784565 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:54Z","lastTransitionTime":"2025-10-06T08:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.888824 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.888901 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.888922 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.888951 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.888972 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:54Z","lastTransitionTime":"2025-10-06T08:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.991481 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.991546 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.991558 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.991579 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:54 crc kubenswrapper[4991]: I1006 08:20:54.991591 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:54Z","lastTransitionTime":"2025-10-06T08:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.095428 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.095484 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.095493 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.095513 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.095523 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:55Z","lastTransitionTime":"2025-10-06T08:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.199058 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.199106 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.199118 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.199139 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.199153 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:55Z","lastTransitionTime":"2025-10-06T08:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.243065 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.243106 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:55 crc kubenswrapper[4991]: E1006 08:20:55.243366 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.243446 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:55 crc kubenswrapper[4991]: E1006 08:20:55.243683 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:55 crc kubenswrapper[4991]: E1006 08:20:55.243855 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.302368 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.302456 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.302481 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.302518 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.302555 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:55Z","lastTransitionTime":"2025-10-06T08:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.406101 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.406165 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.406184 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.406218 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.406240 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:55Z","lastTransitionTime":"2025-10-06T08:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.509887 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.509938 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.509961 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.509986 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.510003 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:55Z","lastTransitionTime":"2025-10-06T08:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.612926 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.613001 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.613023 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.613087 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.613123 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:55Z","lastTransitionTime":"2025-10-06T08:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.717748 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.717929 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.717949 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.718082 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.718107 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:55Z","lastTransitionTime":"2025-10-06T08:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.821798 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.821855 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.821872 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.821896 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.821913 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:55Z","lastTransitionTime":"2025-10-06T08:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.925868 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.925935 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.925947 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.925967 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:55 crc kubenswrapper[4991]: I1006 08:20:55.925983 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:55Z","lastTransitionTime":"2025-10-06T08:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.029575 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.029619 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.029633 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.029655 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.029719 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:56Z","lastTransitionTime":"2025-10-06T08:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.134241 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.134285 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.134318 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.134339 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.134413 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:56Z","lastTransitionTime":"2025-10-06T08:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.237996 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.238072 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.238088 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.238115 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.238129 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:56Z","lastTransitionTime":"2025-10-06T08:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.243373 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:56 crc kubenswrapper[4991]: E1006 08:20:56.244386 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.245148 4991 scope.go:117] "RemoveContainer" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:20:56 crc kubenswrapper[4991]: E1006 08:20:56.245555 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.342021 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.342113 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.342137 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.342177 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.342201 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:56Z","lastTransitionTime":"2025-10-06T08:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.445687 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.445781 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.445804 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.445831 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.445852 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:56Z","lastTransitionTime":"2025-10-06T08:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.549385 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.549456 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.549475 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.549507 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.549529 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:56Z","lastTransitionTime":"2025-10-06T08:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.653049 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.653142 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.653162 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.653191 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.653212 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:56Z","lastTransitionTime":"2025-10-06T08:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.756488 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.756557 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.756574 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.756607 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.756660 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:56Z","lastTransitionTime":"2025-10-06T08:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.860823 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.860907 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.860928 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.860958 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.860980 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:56Z","lastTransitionTime":"2025-10-06T08:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.964642 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.964715 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.964735 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.964762 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:56 crc kubenswrapper[4991]: I1006 08:20:56.964782 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:56Z","lastTransitionTime":"2025-10-06T08:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.069153 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.069242 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.069271 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.069339 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.069367 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:57Z","lastTransitionTime":"2025-10-06T08:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.172488 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.172572 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.172584 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.172603 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.172617 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:57Z","lastTransitionTime":"2025-10-06T08:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.243657 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:57 crc kubenswrapper[4991]: E1006 08:20:57.243888 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.244611 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.244627 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:57 crc kubenswrapper[4991]: E1006 08:20:57.244742 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:57 crc kubenswrapper[4991]: E1006 08:20:57.244998 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.274993 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.275046 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.275060 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.275080 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.275096 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:57Z","lastTransitionTime":"2025-10-06T08:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.378380 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.378437 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.378456 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.378482 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.378502 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:57Z","lastTransitionTime":"2025-10-06T08:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.430383 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.430460 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.430486 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.430521 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.430545 4991 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T08:20:57Z","lastTransitionTime":"2025-10-06T08:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.506421 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg"] Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.507023 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.509513 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.510209 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.510226 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.510738 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.623495 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=57.62347269 podStartE2EDuration="57.62347269s" podCreationTimestamp="2025-10-06 08:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.623005237 +0000 UTC m=+109.360755318" watchObservedRunningTime="2025-10-06 08:20:57.62347269 +0000 UTC m=+109.361222711" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.623767 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.623762139 podStartE2EDuration="1m28.623762139s" podCreationTimestamp="2025-10-06 08:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.604426447 +0000 UTC m=+109.342176528" watchObservedRunningTime="2025-10-06 08:20:57.623762139 +0000 UTC 
m=+109.361512160" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.676009 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.676098 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.676152 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.676218 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.676420 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" 
(UniqueName: \"kubernetes.io/host-path/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.686932 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-scqml" podStartSLOduration=84.686902469 podStartE2EDuration="1m24.686902469s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.657016667 +0000 UTC m=+109.394766738" watchObservedRunningTime="2025-10-06 08:20:57.686902469 +0000 UTC m=+109.424652500" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.736412 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xjvmw" podStartSLOduration=84.736355879 podStartE2EDuration="1m24.736355879s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.688012381 +0000 UTC m=+109.425762422" watchObservedRunningTime="2025-10-06 08:20:57.736355879 +0000 UTC m=+109.474105920" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.754999 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t6c85" podStartSLOduration=83.75498264 podStartE2EDuration="1m23.75498264s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.754715063 +0000 UTC m=+109.492465084" watchObservedRunningTime="2025-10-06 08:20:57.75498264 +0000 UTC m=+109.492732661" 
Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.770187 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=83.770163794 podStartE2EDuration="1m23.770163794s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.76970779 +0000 UTC m=+109.507457811" watchObservedRunningTime="2025-10-06 08:20:57.770163794 +0000 UTC m=+109.507913815" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.777464 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.777527 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.777594 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.777656 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.777687 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.777748 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.777759 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.778607 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.784714 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.794922 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xg9mg\" (UID: \"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.805100 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pgn9b" podStartSLOduration=84.805077039 podStartE2EDuration="1m24.805077039s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.786726685 +0000 UTC m=+109.524476706" watchObservedRunningTime="2025-10-06 08:20:57.805077039 +0000 UTC m=+109.542827060" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.819998 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podStartSLOduration=84.819961803 podStartE2EDuration="1m24.819961803s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.819184031 +0000 UTC 
m=+109.556934052" watchObservedRunningTime="2025-10-06 08:20:57.819961803 +0000 UTC m=+109.557711824" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.828746 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.859549 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=89.859526931 podStartE2EDuration="1m29.859526931s" podCreationTimestamp="2025-10-06 08:19:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.857468683 +0000 UTC m=+109.595218694" watchObservedRunningTime="2025-10-06 08:20:57.859526931 +0000 UTC m=+109.597276952" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.905743 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" event={"ID":"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce","Type":"ContainerStarted","Data":"047e01d6b13aee89ff835c42dd1fdbf6de230d5277df2d12fa81bc69ee37b654"} Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.911060 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bjjz6" podStartSLOduration=84.91103599 podStartE2EDuration="1m24.91103599s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.908847148 +0000 UTC m=+109.646597169" watchObservedRunningTime="2025-10-06 08:20:57.91103599 +0000 UTC m=+109.648786011" Oct 06 08:20:57 crc kubenswrapper[4991]: I1006 08:20:57.943774 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
podStartSLOduration=29.943750722 podStartE2EDuration="29.943750722s" podCreationTimestamp="2025-10-06 08:20:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:57.942873188 +0000 UTC m=+109.680623219" watchObservedRunningTime="2025-10-06 08:20:57.943750722 +0000 UTC m=+109.681500743" Oct 06 08:20:58 crc kubenswrapper[4991]: I1006 08:20:58.243842 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:20:58 crc kubenswrapper[4991]: E1006 08:20:58.244240 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:20:58 crc kubenswrapper[4991]: I1006 08:20:58.912208 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" event={"ID":"6b4b80bf-f51a-4cfb-83be-5e1ea80c64ce","Type":"ContainerStarted","Data":"6e3069f9c81e079c9f2646d66fafbd0da560eda8d3a5cddae3f17573c47f16bf"} Oct 06 08:20:58 crc kubenswrapper[4991]: I1006 08:20:58.931454 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xg9mg" podStartSLOduration=85.931432515 podStartE2EDuration="1m25.931432515s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:20:58.93091836 +0000 UTC m=+110.668668431" watchObservedRunningTime="2025-10-06 08:20:58.931432515 +0000 UTC m=+110.669182536" Oct 06 08:20:59 
crc kubenswrapper[4991]: I1006 08:20:59.243607 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:20:59 crc kubenswrapper[4991]: E1006 08:20:59.244932 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:20:59 crc kubenswrapper[4991]: I1006 08:20:59.245054 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:20:59 crc kubenswrapper[4991]: I1006 08:20:59.245062 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:20:59 crc kubenswrapper[4991]: E1006 08:20:59.245170 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:20:59 crc kubenswrapper[4991]: E1006 08:20:59.245370 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:00 crc kubenswrapper[4991]: I1006 08:21:00.243615 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:00 crc kubenswrapper[4991]: E1006 08:21:00.243873 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:01 crc kubenswrapper[4991]: I1006 08:21:01.243559 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:01 crc kubenswrapper[4991]: I1006 08:21:01.243605 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:01 crc kubenswrapper[4991]: I1006 08:21:01.243605 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:01 crc kubenswrapper[4991]: E1006 08:21:01.243762 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:01 crc kubenswrapper[4991]: E1006 08:21:01.243904 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:01 crc kubenswrapper[4991]: E1006 08:21:01.244030 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:02 crc kubenswrapper[4991]: I1006 08:21:02.243221 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:02 crc kubenswrapper[4991]: E1006 08:21:02.243720 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:03 crc kubenswrapper[4991]: I1006 08:21:03.243546 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:03 crc kubenswrapper[4991]: I1006 08:21:03.243546 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:03 crc kubenswrapper[4991]: I1006 08:21:03.243673 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:03 crc kubenswrapper[4991]: E1006 08:21:03.243780 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:03 crc kubenswrapper[4991]: E1006 08:21:03.244045 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:03 crc kubenswrapper[4991]: E1006 08:21:03.244204 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:04 crc kubenswrapper[4991]: I1006 08:21:04.243574 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:04 crc kubenswrapper[4991]: E1006 08:21:04.244627 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:05 crc kubenswrapper[4991]: I1006 08:21:05.243410 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:05 crc kubenswrapper[4991]: I1006 08:21:05.243432 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:05 crc kubenswrapper[4991]: E1006 08:21:05.244276 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:05 crc kubenswrapper[4991]: I1006 08:21:05.243564 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:05 crc kubenswrapper[4991]: E1006 08:21:05.244451 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:05 crc kubenswrapper[4991]: E1006 08:21:05.245192 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:06 crc kubenswrapper[4991]: I1006 08:21:06.243218 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:06 crc kubenswrapper[4991]: E1006 08:21:06.243510 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:07 crc kubenswrapper[4991]: I1006 08:21:07.243772 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:07 crc kubenswrapper[4991]: I1006 08:21:07.243861 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:07 crc kubenswrapper[4991]: E1006 08:21:07.243931 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:07 crc kubenswrapper[4991]: I1006 08:21:07.244029 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:07 crc kubenswrapper[4991]: E1006 08:21:07.244216 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:07 crc kubenswrapper[4991]: E1006 08:21:07.244385 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:08 crc kubenswrapper[4991]: I1006 08:21:08.242958 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:08 crc kubenswrapper[4991]: E1006 08:21:08.243224 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:08 crc kubenswrapper[4991]: I1006 08:21:08.244129 4991 scope.go:117] "RemoveContainer" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:21:08 crc kubenswrapper[4991]: E1006 08:21:08.244900 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qwljw_openshift-ovn-kubernetes(977b0faa-5b3d-4e9d-bef4-ba47f8764c6e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" Oct 06 08:21:09 crc kubenswrapper[4991]: E1006 08:21:09.198034 4991 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 06 08:21:09 crc kubenswrapper[4991]: I1006 08:21:09.243623 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:09 crc kubenswrapper[4991]: I1006 08:21:09.244113 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:09 crc kubenswrapper[4991]: E1006 08:21:09.244866 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:09 crc kubenswrapper[4991]: I1006 08:21:09.244925 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:09 crc kubenswrapper[4991]: E1006 08:21:09.245033 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:09 crc kubenswrapper[4991]: E1006 08:21:09.245157 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:09 crc kubenswrapper[4991]: E1006 08:21:09.340111 4991 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 08:21:09 crc kubenswrapper[4991]: I1006 08:21:09.956166 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjvmw_58386a1a-6047-42ce-a952-43f397822919/kube-multus/1.log" Oct 06 08:21:09 crc kubenswrapper[4991]: I1006 08:21:09.957093 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjvmw_58386a1a-6047-42ce-a952-43f397822919/kube-multus/0.log" Oct 06 08:21:09 crc kubenswrapper[4991]: I1006 08:21:09.957171 4991 generic.go:334] "Generic (PLEG): container finished" podID="58386a1a-6047-42ce-a952-43f397822919" containerID="e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771" exitCode=1 Oct 06 08:21:09 crc kubenswrapper[4991]: I1006 08:21:09.957232 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjvmw" event={"ID":"58386a1a-6047-42ce-a952-43f397822919","Type":"ContainerDied","Data":"e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771"} Oct 06 08:21:09 crc kubenswrapper[4991]: I1006 08:21:09.957361 4991 scope.go:117] "RemoveContainer" containerID="688ab716efc3f5048086ffd9712d3623248863c9fa472a07a76b6d144d2bc793" Oct 06 08:21:09 crc kubenswrapper[4991]: I1006 08:21:09.958077 4991 scope.go:117] "RemoveContainer" containerID="e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771" Oct 06 08:21:09 crc kubenswrapper[4991]: E1006 08:21:09.958423 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-multus pod=multus-xjvmw_openshift-multus(58386a1a-6047-42ce-a952-43f397822919)\"" pod="openshift-multus/multus-xjvmw" podUID="58386a1a-6047-42ce-a952-43f397822919" Oct 06 08:21:10 crc kubenswrapper[4991]: I1006 08:21:10.242768 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:10 crc kubenswrapper[4991]: E1006 08:21:10.242924 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:10 crc kubenswrapper[4991]: I1006 08:21:10.964049 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjvmw_58386a1a-6047-42ce-a952-43f397822919/kube-multus/1.log" Oct 06 08:21:11 crc kubenswrapper[4991]: I1006 08:21:11.243460 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:11 crc kubenswrapper[4991]: I1006 08:21:11.243513 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:11 crc kubenswrapper[4991]: I1006 08:21:11.243481 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:11 crc kubenswrapper[4991]: E1006 08:21:11.243664 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:11 crc kubenswrapper[4991]: E1006 08:21:11.243837 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:11 crc kubenswrapper[4991]: E1006 08:21:11.244096 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:12 crc kubenswrapper[4991]: I1006 08:21:12.243224 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:12 crc kubenswrapper[4991]: E1006 08:21:12.243467 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:13 crc kubenswrapper[4991]: I1006 08:21:13.243279 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:13 crc kubenswrapper[4991]: I1006 08:21:13.243418 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:13 crc kubenswrapper[4991]: I1006 08:21:13.243431 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:13 crc kubenswrapper[4991]: E1006 08:21:13.243545 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:13 crc kubenswrapper[4991]: E1006 08:21:13.243666 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:13 crc kubenswrapper[4991]: E1006 08:21:13.243880 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:14 crc kubenswrapper[4991]: I1006 08:21:14.243633 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:14 crc kubenswrapper[4991]: E1006 08:21:14.243853 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:14 crc kubenswrapper[4991]: E1006 08:21:14.341803 4991 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 08:21:15 crc kubenswrapper[4991]: I1006 08:21:15.243374 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:15 crc kubenswrapper[4991]: I1006 08:21:15.243432 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:15 crc kubenswrapper[4991]: I1006 08:21:15.243432 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:15 crc kubenswrapper[4991]: E1006 08:21:15.244200 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:15 crc kubenswrapper[4991]: E1006 08:21:15.244325 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:15 crc kubenswrapper[4991]: E1006 08:21:15.245058 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:16 crc kubenswrapper[4991]: I1006 08:21:16.242908 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:16 crc kubenswrapper[4991]: E1006 08:21:16.243417 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:17 crc kubenswrapper[4991]: I1006 08:21:17.243394 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:17 crc kubenswrapper[4991]: I1006 08:21:17.243537 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:17 crc kubenswrapper[4991]: I1006 08:21:17.244568 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:17 crc kubenswrapper[4991]: E1006 08:21:17.244844 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:17 crc kubenswrapper[4991]: E1006 08:21:17.245055 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:17 crc kubenswrapper[4991]: E1006 08:21:17.245136 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:18 crc kubenswrapper[4991]: I1006 08:21:18.243106 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:18 crc kubenswrapper[4991]: E1006 08:21:18.243598 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:19 crc kubenswrapper[4991]: I1006 08:21:19.242936 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:19 crc kubenswrapper[4991]: I1006 08:21:19.243012 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:19 crc kubenswrapper[4991]: E1006 08:21:19.244904 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:19 crc kubenswrapper[4991]: I1006 08:21:19.244968 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:19 crc kubenswrapper[4991]: E1006 08:21:19.245192 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:19 crc kubenswrapper[4991]: E1006 08:21:19.245266 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:19 crc kubenswrapper[4991]: E1006 08:21:19.342773 4991 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 08:21:20 crc kubenswrapper[4991]: I1006 08:21:20.243645 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:20 crc kubenswrapper[4991]: E1006 08:21:20.243829 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:21 crc kubenswrapper[4991]: I1006 08:21:21.244573 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:21 crc kubenswrapper[4991]: I1006 08:21:21.244652 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:21 crc kubenswrapper[4991]: E1006 08:21:21.244824 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:21 crc kubenswrapper[4991]: I1006 08:21:21.244888 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:21 crc kubenswrapper[4991]: E1006 08:21:21.245349 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:21 crc kubenswrapper[4991]: E1006 08:21:21.245530 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:22 crc kubenswrapper[4991]: I1006 08:21:22.243603 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:22 crc kubenswrapper[4991]: E1006 08:21:22.243832 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:23 crc kubenswrapper[4991]: I1006 08:21:23.243722 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:23 crc kubenswrapper[4991]: I1006 08:21:23.243722 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:23 crc kubenswrapper[4991]: I1006 08:21:23.243756 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:23 crc kubenswrapper[4991]: E1006 08:21:23.244311 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:23 crc kubenswrapper[4991]: E1006 08:21:23.244415 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:23 crc kubenswrapper[4991]: I1006 08:21:23.244623 4991 scope.go:117] "RemoveContainer" containerID="e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771" Oct 06 08:21:23 crc kubenswrapper[4991]: E1006 08:21:23.247580 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:23 crc kubenswrapper[4991]: I1006 08:21:23.248256 4991 scope.go:117] "RemoveContainer" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:21:24 crc kubenswrapper[4991]: I1006 08:21:24.017888 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/3.log" Oct 06 08:21:24 crc kubenswrapper[4991]: I1006 08:21:24.021897 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerStarted","Data":"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb"} Oct 06 08:21:24 crc kubenswrapper[4991]: I1006 08:21:24.022443 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:21:24 crc kubenswrapper[4991]: I1006 08:21:24.024111 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjvmw_58386a1a-6047-42ce-a952-43f397822919/kube-multus/1.log" Oct 06 08:21:24 crc kubenswrapper[4991]: I1006 08:21:24.024185 4991 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/multus-xjvmw" event={"ID":"58386a1a-6047-42ce-a952-43f397822919","Type":"ContainerStarted","Data":"9b6902fadf422e50276f1e9aed20f9eb81e712f105467693407490a695638a3f"} Oct 06 08:21:24 crc kubenswrapper[4991]: I1006 08:21:24.063211 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podStartSLOduration=110.063178583 podStartE2EDuration="1m50.063178583s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:24.062129424 +0000 UTC m=+135.799879465" watchObservedRunningTime="2025-10-06 08:21:24.063178583 +0000 UTC m=+135.800928614" Oct 06 08:21:24 crc kubenswrapper[4991]: I1006 08:21:24.243485 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:24 crc kubenswrapper[4991]: E1006 08:21:24.243633 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:24 crc kubenswrapper[4991]: I1006 08:21:24.337362 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-787zw"] Oct 06 08:21:24 crc kubenswrapper[4991]: E1006 08:21:24.344224 4991 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 06 08:21:25 crc kubenswrapper[4991]: I1006 08:21:25.028449 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:25 crc kubenswrapper[4991]: E1006 08:21:25.029341 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:25 crc kubenswrapper[4991]: I1006 08:21:25.243468 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:25 crc kubenswrapper[4991]: I1006 08:21:25.243628 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:25 crc kubenswrapper[4991]: E1006 08:21:25.243711 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:25 crc kubenswrapper[4991]: I1006 08:21:25.243754 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:25 crc kubenswrapper[4991]: E1006 08:21:25.243847 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:25 crc kubenswrapper[4991]: E1006 08:21:25.243976 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:27 crc kubenswrapper[4991]: I1006 08:21:27.243638 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:27 crc kubenswrapper[4991]: I1006 08:21:27.243798 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:27 crc kubenswrapper[4991]: I1006 08:21:27.243910 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:27 crc kubenswrapper[4991]: E1006 08:21:27.243891 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:27 crc kubenswrapper[4991]: I1006 08:21:27.243989 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:27 crc kubenswrapper[4991]: E1006 08:21:27.244213 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:27 crc kubenswrapper[4991]: E1006 08:21:27.244647 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:27 crc kubenswrapper[4991]: E1006 08:21:27.244888 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:29 crc kubenswrapper[4991]: I1006 08:21:29.243825 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:29 crc kubenswrapper[4991]: I1006 08:21:29.243853 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:29 crc kubenswrapper[4991]: I1006 08:21:29.243887 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:29 crc kubenswrapper[4991]: I1006 08:21:29.244013 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:29 crc kubenswrapper[4991]: E1006 08:21:29.246097 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 08:21:29 crc kubenswrapper[4991]: E1006 08:21:29.246223 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 08:21:29 crc kubenswrapper[4991]: E1006 08:21:29.246441 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-787zw" podUID="3e38e446-d0d7-463a-987a-110a8e95fe84" Oct 06 08:21:29 crc kubenswrapper[4991]: E1006 08:21:29.246617 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 08:21:31 crc kubenswrapper[4991]: I1006 08:21:31.243767 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:31 crc kubenswrapper[4991]: I1006 08:21:31.243866 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:31 crc kubenswrapper[4991]: I1006 08:21:31.243932 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:31 crc kubenswrapper[4991]: I1006 08:21:31.244083 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:31 crc kubenswrapper[4991]: I1006 08:21:31.247835 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 06 08:21:31 crc kubenswrapper[4991]: I1006 08:21:31.247884 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 06 08:21:31 crc kubenswrapper[4991]: I1006 08:21:31.247894 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 06 08:21:31 crc kubenswrapper[4991]: I1006 08:21:31.248250 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 06 08:21:31 crc kubenswrapper[4991]: I1006 08:21:31.248256 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 06 08:21:31 crc kubenswrapper[4991]: I1006 08:21:31.248673 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 06 08:21:34 crc kubenswrapper[4991]: I1006 08:21:34.918869 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.164165 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.164371 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:37 crc kubenswrapper[4991]: E1006 08:21:37.164444 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:23:39.164394408 +0000 UTC m=+270.902144449 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.164562 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.164642 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.164689 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.165643 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.172149 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.172528 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.173791 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.288344 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.303362 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:21:37 crc kubenswrapper[4991]: I1006 08:21:37.316467 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 08:21:37 crc kubenswrapper[4991]: W1006 08:21:37.537883 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6527c52f3e2511303f9a4b73725102495924ad1447a69bc53a98180b81dd2a83 WatchSource:0}: Error finding container 6527c52f3e2511303f9a4b73725102495924ad1447a69bc53a98180b81dd2a83: Status 404 returned error can't find the container with id 6527c52f3e2511303f9a4b73725102495924ad1447a69bc53a98180b81dd2a83
Oct 06 08:21:37 crc kubenswrapper[4991]: W1006 08:21:37.757280 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4cb0c466daefdca12d921d702db59c5ada54500fff53ed370444fb158d7e01e5 WatchSource:0}: Error finding container 4cb0c466daefdca12d921d702db59c5ada54500fff53ed370444fb158d7e01e5: Status 404 returned error can't find the container with id 4cb0c466daefdca12d921d702db59c5ada54500fff53ed370444fb158d7e01e5
Oct 06 08:21:37 crc kubenswrapper[4991]: W1006 08:21:37.786245 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-aa84911ead72f353e89c2b29dcf8593c87f484ad3a6c6fa906db75f73d02deb8 WatchSource:0}: Error finding container aa84911ead72f353e89c2b29dcf8593c87f484ad3a6c6fa906db75f73d02deb8: Status 404 returned error can't find the container with id aa84911ead72f353e89c2b29dcf8593c87f484ad3a6c6fa906db75f73d02deb8
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.084280 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bb97f693a22d2ffd4730fbc32cde49ec5ac7d9f2f7b1ae333456a5d2b8c2362a"}
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.084376 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"aa84911ead72f353e89c2b29dcf8593c87f484ad3a6c6fa906db75f73d02deb8"}
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.093563 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0d46e00dca88958019a71b99e1b064163b3cb95f45ffd450853c3499bc07a755"}
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.093658 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6527c52f3e2511303f9a4b73725102495924ad1447a69bc53a98180b81dd2a83"}
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.098451 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5e8b24311c62aa880ec6c3a81e604f862d2e5c50951addf11e4346da39f50862"}
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.098539 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4cb0c466daefdca12d921d702db59c5ada54500fff53ed370444fb158d7e01e5"}
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.099321 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.465715 4991 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.503215 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.503852 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.507236 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vtcb6"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.507543 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.507817 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.508108 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.511051 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.511376 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.511661 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.511878 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.512035 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.512209 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.512630 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.512629 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.512738 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.514036 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.514882 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.516758 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vr6sj"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.517766 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.527025 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.527228 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.527922 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.528086 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.528251 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.528398 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.528552 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.528680 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.528965 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.529104 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.530341 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nghng"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.530944 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.531405 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.532749 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lcvbr"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.533541 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.534452 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.535500 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.537338 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-kp5gc"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.538076 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-d2pr9"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.538103 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kp5gc"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.538910 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.541685 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.542379 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.543652 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zl4k8"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.545033 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.552970 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.553675 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.554503 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.554958 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.555382 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.556170 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.561346 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.561347 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.575882 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.575996 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.585409 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.586432 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.587993 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.594341 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3ae516b-0866-40bc-b886-44111fef9329-audit-dir\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.594405 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ae516b-0866-40bc-b886-44111fef9329-serving-cert\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.594449 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d92zz\" (UniqueName: \"kubernetes.io/projected/d3ae516b-0866-40bc-b886-44111fef9329-kube-api-access-d92zz\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.609779 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3ae516b-0866-40bc-b886-44111fef9329-etcd-client\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.609852 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3ae516b-0866-40bc-b886-44111fef9329-node-pullsecrets\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.609886 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-audit\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.609914 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-trusted-ca-bundle\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.609940 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-etcd-serving-ca\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.609973 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3ae516b-0866-40bc-b886-44111fef9329-encryption-config\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.609996 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-config\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.610014 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-image-import-ca\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.611220 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.611448 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.611537 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.611672 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.611809 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.611903 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.611978 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.612069 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.612200 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.612185 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.612330 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.612440 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.613168 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.613289 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.613472 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.613639 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.613958 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.614542 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.615319 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.615589 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.615720 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.615817 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.615818 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.615945 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.616221 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.618218 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.618356 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.619076 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.619195 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.619409 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.619540 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.619748 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.619873 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.620434 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.620663 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.620748 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.620773 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.620898 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.621437 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.623464 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.623799 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.628712 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.628842 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.629015 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.629409 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.629669 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.629713 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.629833 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.629887 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.629966 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.629985 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630089 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630199 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630367 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630433 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630480 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630575 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630718 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630747 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630838 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630848 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.630985 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.629838 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.634936 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.635757 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.636586 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.636828 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.637121 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.637642 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.638008 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.639080 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.652649 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p5tk4"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.653367 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.653856 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.654012 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.654896 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.656154 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.656705 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.656962 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.657064 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.657113 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.657160 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.657428 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.678520 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.680002 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nxbxt"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.682109 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nxbxt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.682526 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.696778 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.696984 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.697440 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.701199 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.701206 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.702260 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.703133 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8zvhl"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.703942 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.704209 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.705021 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.707924 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.708545 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hbwft"]
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.708953 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft"
Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.709241 4991 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.710929 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3ae516b-0866-40bc-b886-44111fef9329-audit-dir\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.710965 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ae516b-0866-40bc-b886-44111fef9329-serving-cert\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.710986 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d92zz\" (UniqueName: \"kubernetes.io/projected/d3ae516b-0866-40bc-b886-44111fef9329-kube-api-access-d92zz\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.711021 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3ae516b-0866-40bc-b886-44111fef9329-etcd-client\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.711042 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3ae516b-0866-40bc-b886-44111fef9329-node-pullsecrets\") pod \"apiserver-76f77b778f-d2pr9\" (UID: 
\"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.711061 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-audit\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.711076 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-trusted-ca-bundle\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.711094 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-etcd-serving-ca\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.711113 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3ae516b-0866-40bc-b886-44111fef9329-encryption-config\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.711128 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-config\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " 
pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.711146 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-image-import-ca\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.712004 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d3ae516b-0866-40bc-b886-44111fef9329-node-pullsecrets\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.712063 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3ae516b-0866-40bc-b886-44111fef9329-audit-dir\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.713355 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-etcd-serving-ca\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.713495 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-image-import-ca\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 
06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.714111 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-audit\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.714577 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-trusted-ca-bundle\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.714981 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.715175 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ae516b-0866-40bc-b886-44111fef9329-config\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.717413 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.719341 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ae516b-0866-40bc-b886-44111fef9329-serving-cert\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.724173 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-g2mjl"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.724747 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.725205 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.725643 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.726679 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.729604 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.730391 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.730849 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4pctg"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.731323 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.736695 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3ae516b-0866-40bc-b886-44111fef9329-encryption-config\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.736789 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.737127 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3ae516b-0866-40bc-b886-44111fef9329-etcd-client\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.737865 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.738390 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.738463 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.739762 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vxbrw"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.740495 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.740803 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vr6sj"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.741315 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.742465 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-l2z7h"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.743246 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.746201 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.746937 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.747828 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.748107 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.748508 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lcvbr"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.748700 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.749858 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.751843 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vtcb6"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.751867 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.754053 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.754693 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.757599 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.761173 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kp5gc"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.761220 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-zl4k8"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.761236 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.765317 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.767148 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.767852 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.769427 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.772848 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.774768 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9nxks"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.775683 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.776856 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.780986 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.784236 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-d2pr9"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.786199 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.786782 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.790920 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.801513 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-w8qs2"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.803336 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w8qs2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.803622 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.807996 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.809091 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.812828 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzprj\" (UniqueName: \"kubernetes.io/projected/9a3137ff-020a-43c5-867a-9ab59df067ff-kube-api-access-mzprj\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.814465 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-images\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.814613 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.814694 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a3137ff-020a-43c5-867a-9ab59df067ff-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.814789 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a3137ff-020a-43c5-867a-9ab59df067ff-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.814880 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aab780d-af84-45fa-bc9c-b728d4e196d1-config-volume\") pod \"collect-profiles-29328975-9xcfl\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.814975 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815064 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8055779b-d4d8-48ce-bb04-f49073e28dc1-serving-cert\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815158 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb83cb02-67d8-4f38-aad6-001ea28de60a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815239 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-etcd-client\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815373 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815421 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb83cb02-67d8-4f38-aad6-001ea28de60a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zl4k8\" 
(UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815458 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-trusted-ca-bundle\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815484 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-config\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815517 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-serving-cert\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815542 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-service-ca\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815568 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815591 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-config\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815621 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jff\" (UniqueName: \"kubernetes.io/projected/8f5f1533-ca00-4377-853e-c5433faa591e-kube-api-access-t2jff\") pod \"openshift-apiserver-operator-796bbdcf4f-kw55l\" (UID: \"8f5f1533-ca00-4377-853e-c5433faa591e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815646 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ef6468-c4e0-4a26-820b-ddd444b50a07-serving-cert\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815672 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8055779b-d4d8-48ce-bb04-f49073e28dc1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815701 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-policies\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815732 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-oauth-serving-cert\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815759 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815788 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/337bd770-81c5-466f-a32e-9fef462765c8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gc8rv\" (UID: \"337bd770-81c5-466f-a32e-9fef462765c8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815815 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-png2l\" (UniqueName: \"kubernetes.io/projected/1aab780d-af84-45fa-bc9c-b728d4e196d1-kube-api-access-png2l\") pod \"collect-profiles-29328975-9xcfl\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815842 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-tls\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815867 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5f1533-ca00-4377-853e-c5433faa591e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kw55l\" (UID: \"8f5f1533-ca00-4377-853e-c5433faa591e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815893 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnpq5\" (UniqueName: \"kubernetes.io/projected/8055779b-d4d8-48ce-bb04-f49073e28dc1-kube-api-access-jnpq5\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815921 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-trusted-ca\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: 
\"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815948 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z98x8\" (UniqueName: \"kubernetes.io/projected/11045b9f-1d93-4f1d-852e-02354ef51979-kube-api-access-z98x8\") pod \"cluster-samples-operator-665b6dd947-p6vv2\" (UID: \"11045b9f-1d93-4f1d-852e-02354ef51979\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.815989 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-config\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816016 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hk8j\" (UniqueName: \"kubernetes.io/projected/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-kube-api-access-6hk8j\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816042 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816068 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ckkw\" (UniqueName: \"kubernetes.io/projected/8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7-kube-api-access-4ckkw\") pod \"catalog-operator-68c6474976-bps7p\" (UID: \"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816096 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-bound-sa-token\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816121 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816147 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-oauth-config\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816169 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-console-config\") pod \"console-f9d7485db-kp5gc\" 
(UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816192 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-client-ca\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816215 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7-srv-cert\") pod \"catalog-operator-68c6474976-bps7p\" (UID: \"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816240 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816263 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f5f1533-ca00-4377-853e-c5433faa591e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kw55l\" (UID: \"8f5f1533-ca00-4377-853e-c5433faa591e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816285 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aab780d-af84-45fa-bc9c-b728d4e196d1-secret-volume\") pod \"collect-profiles-29328975-9xcfl\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816370 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bzcr\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-kube-api-access-6bzcr\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816395 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816425 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816448 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-bps7p\" (UID: 
\"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816475 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-audit-policies\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816496 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-serving-cert\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816516 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816535 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816554 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816569 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-encryption-config\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816589 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-audit-dir\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816605 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8055779b-d4d8-48ce-bb04-f49073e28dc1-service-ca-bundle\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816632 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp2d9\" (UniqueName: \"kubernetes.io/projected/337bd770-81c5-466f-a32e-9fef462765c8-kube-api-access-cp2d9\") pod \"openshift-config-operator-7777fb866f-gc8rv\" (UID: 
\"337bd770-81c5-466f-a32e-9fef462765c8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816651 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-dir\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816667 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-certificates\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816684 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15148cd6-6d64-4a92-a334-b5014bf8b05a-serving-cert\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816699 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl7r2\" (UniqueName: \"kubernetes.io/projected/15148cd6-6d64-4a92-a334-b5014bf8b05a-kube-api-access-gl7r2\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816714 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11045b9f-1d93-4f1d-852e-02354ef51979-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p6vv2\" (UID: \"11045b9f-1d93-4f1d-852e-02354ef51979\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816731 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss4zd\" (UniqueName: \"kubernetes.io/projected/ac30cd53-f61e-4f56-8110-4eacc0aade3f-kube-api-access-ss4zd\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816746 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/337bd770-81c5-466f-a32e-9fef462765c8-serving-cert\") pod \"openshift-config-operator-7777fb866f-gc8rv\" (UID: \"337bd770-81c5-466f-a32e-9fef462765c8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816766 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4c8j\" (UniqueName: \"kubernetes.io/projected/c941e944-a837-41ff-90b0-29464fc3f02d-kube-api-access-c4c8j\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816782 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8055779b-d4d8-48ce-bb04-f49073e28dc1-config\") pod \"authentication-operator-69f744f599-nghng\" (UID: 
\"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816798 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816816 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-289sj\" (UniqueName: \"kubernetes.io/projected/f4ef6468-c4e0-4a26-820b-ddd444b50a07-kube-api-access-289sj\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816834 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q6gh\" (UniqueName: \"kubernetes.io/projected/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-kube-api-access-5q6gh\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816857 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a3137ff-020a-43c5-867a-9ab59df067ff-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:38 crc 
kubenswrapper[4991]: I1006 08:21:38.816880 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816898 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.816915 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-client-ca\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.817601 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nghng"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.817638 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5"] Oct 06 08:21:38 crc kubenswrapper[4991]: E1006 08:21:38.817830 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-10-06 08:21:39.317814144 +0000 UTC m=+151.055564165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.818284 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.820228 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nxbxt"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.821993 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.823455 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8zvhl"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.824903 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hbwft"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.826403 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.826604 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.828641 4991 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p5tk4"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.831250 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4pctg"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.833489 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mwdsr"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.834184 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.836438 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pgp24"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.837750 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.839182 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vxbrw"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.841547 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.843904 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.845896 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.846403 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9nxks"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.848009 4991 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-l2z7h"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.849344 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pgp24"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.850743 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w8qs2"] Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.874060 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.886232 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.909704 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.917884 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.918077 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hk8j\" (UniqueName: \"kubernetes.io/projected/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-kube-api-access-6hk8j\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.918109 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.918132 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ckkw\" (UniqueName: \"kubernetes.io/projected/8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7-kube-api-access-4ckkw\") pod \"catalog-operator-68c6474976-bps7p\" (UID: \"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.918162 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-config\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.918332 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14e2aab8-50a5-4db6-9efa-579949a454bb-metrics-tls\") pod \"dns-operator-744455d44c-8zvhl\" (UID: \"14e2aab8-50a5-4db6-9efa-579949a454bb\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" Oct 06 08:21:38 crc kubenswrapper[4991]: E1006 08:21:38.918574 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:39.418324789 +0000 UTC m=+151.156074810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.918759 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcw9l\" (UniqueName: \"kubernetes.io/projected/c78d976c-800d-4739-bdd4-5b8e5943c0a5-kube-api-access-dcw9l\") pod \"migrator-59844c95c7-tgxlv\" (UID: \"c78d976c-800d-4739-bdd4-5b8e5943c0a5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.918864 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.918941 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-bound-sa-token\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919018 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-oauth-config\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919076 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-console-config\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919102 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-client-ca\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919151 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7-srv-cert\") pod \"catalog-operator-68c6474976-bps7p\" (UID: \"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919178 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919268 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919332 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f5f1533-ca00-4377-853e-c5433faa591e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kw55l\" (UID: \"8f5f1533-ca00-4377-853e-c5433faa591e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919363 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aab780d-af84-45fa-bc9c-b728d4e196d1-secret-volume\") pod \"collect-profiles-29328975-9xcfl\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919409 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919433 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-bps7p\" (UID: \"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919496 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bzcr\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-kube-api-access-6bzcr\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919523 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919577 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-audit-policies\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919609 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919659 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919691 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-serving-cert\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919734 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919765 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-encryption-config\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919815 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-audit-dir\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919851 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-dir\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919897 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8055779b-d4d8-48ce-bb04-f49073e28dc1-service-ca-bundle\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919924 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp2d9\" (UniqueName: \"kubernetes.io/projected/337bd770-81c5-466f-a32e-9fef462765c8-kube-api-access-cp2d9\") pod \"openshift-config-operator-7777fb866f-gc8rv\" (UID: \"337bd770-81c5-466f-a32e-9fef462765c8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919985 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-certificates\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920010 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/337bd770-81c5-466f-a32e-9fef462765c8-serving-cert\") pod \"openshift-config-operator-7777fb866f-gc8rv\" (UID: \"337bd770-81c5-466f-a32e-9fef462765c8\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.919737 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-config\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920090 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-console-config\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920060 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d-signing-key\") pod \"service-ca-9c57cc56f-vxbrw\" (UID: \"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d\") " pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920210 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15148cd6-6d64-4a92-a334-b5014bf8b05a-serving-cert\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920259 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl7r2\" (UniqueName: \"kubernetes.io/projected/15148cd6-6d64-4a92-a334-b5014bf8b05a-kube-api-access-gl7r2\") pod 
\"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920284 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11045b9f-1d93-4f1d-852e-02354ef51979-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p6vv2\" (UID: \"11045b9f-1d93-4f1d-852e-02354ef51979\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920332 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss4zd\" (UniqueName: \"kubernetes.io/projected/ac30cd53-f61e-4f56-8110-4eacc0aade3f-kube-api-access-ss4zd\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920366 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4c8j\" (UniqueName: \"kubernetes.io/projected/c941e944-a837-41ff-90b0-29464fc3f02d-kube-api-access-c4c8j\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920388 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8055779b-d4d8-48ce-bb04-f49073e28dc1-config\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920407 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-289sj\" (UniqueName: \"kubernetes.io/projected/f4ef6468-c4e0-4a26-820b-ddd444b50a07-kube-api-access-289sj\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920445 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q6gh\" (UniqueName: \"kubernetes.io/projected/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-kube-api-access-5q6gh\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920465 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a3137ff-020a-43c5-867a-9ab59df067ff-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920500 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920524 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vtcb6\" 
(UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920552 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920586 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-client-ca\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920610 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-client-ca\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920615 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzprj\" (UniqueName: \"kubernetes.io/projected/9a3137ff-020a-43c5-867a-9ab59df067ff-kube-api-access-mzprj\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920689 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8055779b-d4d8-48ce-bb04-f49073e28dc1-service-ca-bundle\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920691 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/227e09e3-5d47-4490-9135-0ecb29c18623-proxy-tls\") pod \"machine-config-controller-84d6567774-dlk6z\" (UID: \"227e09e3-5d47-4490-9135-0ecb29c18623\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920736 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-audit-dir\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920737 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-images\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920772 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s2dt\" (UniqueName: \"kubernetes.io/projected/1c08ea1b-603c-4e53-89b6-bff65bd7154f-kube-api-access-9s2dt\") pod \"olm-operator-6b444d44fb-rd6sk\" (UID: \"1c08ea1b-603c-4e53-89b6-bff65bd7154f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 
08:21:38.920799 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920819 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a3137ff-020a-43c5-867a-9ab59df067ff-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920837 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a3137ff-020a-43c5-867a-9ab59df067ff-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920858 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aab780d-af84-45fa-bc9c-b728d4e196d1-config-volume\") pod \"collect-profiles-29328975-9xcfl\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920879 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1c08ea1b-603c-4e53-89b6-bff65bd7154f-srv-cert\") pod \"olm-operator-6b444d44fb-rd6sk\" (UID: 
\"1c08ea1b-603c-4e53-89b6-bff65bd7154f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920920 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920943 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8055779b-d4d8-48ce-bb04-f49073e28dc1-serving-cert\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920964 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb83cb02-67d8-4f38-aad6-001ea28de60a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.920984 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-etcd-client\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921007 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921056 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d-signing-cabundle\") pod \"service-ca-9c57cc56f-vxbrw\" (UID: \"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d\") " pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921082 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb83cb02-67d8-4f38-aad6-001ea28de60a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921101 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-trusted-ca-bundle\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921118 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-config\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921135 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-service-ca\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921154 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921171 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-config\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921200 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-serving-cert\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921218 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsvg4\" (UniqueName: \"kubernetes.io/projected/14e2aab8-50a5-4db6-9efa-579949a454bb-kube-api-access-vsvg4\") pod \"dns-operator-744455d44c-8zvhl\" (UID: \"14e2aab8-50a5-4db6-9efa-579949a454bb\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" Oct 06 08:21:38 crc 
kubenswrapper[4991]: I1006 08:21:38.921238 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/227e09e3-5d47-4490-9135-0ecb29c18623-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dlk6z\" (UID: \"227e09e3-5d47-4490-9135-0ecb29c18623\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921266 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jff\" (UniqueName: \"kubernetes.io/projected/8f5f1533-ca00-4377-853e-c5433faa591e-kube-api-access-t2jff\") pod \"openshift-apiserver-operator-796bbdcf4f-kw55l\" (UID: \"8f5f1533-ca00-4377-853e-c5433faa591e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921284 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ef6468-c4e0-4a26-820b-ddd444b50a07-serving-cert\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921343 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8055779b-d4d8-48ce-bb04-f49073e28dc1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921374 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mrs4\" (UniqueName: 
\"kubernetes.io/projected/0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d-kube-api-access-6mrs4\") pod \"service-ca-9c57cc56f-vxbrw\" (UID: \"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d\") " pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921408 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-policies\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921429 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/337bd770-81c5-466f-a32e-9fef462765c8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gc8rv\" (UID: \"337bd770-81c5-466f-a32e-9fef462765c8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921448 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-png2l\" (UniqueName: \"kubernetes.io/projected/1aab780d-af84-45fa-bc9c-b728d4e196d1-kube-api-access-png2l\") pod \"collect-profiles-29328975-9xcfl\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921466 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-oauth-serving-cert\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921485 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921501 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-tls\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921527 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5f1533-ca00-4377-853e-c5433faa591e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kw55l\" (UID: \"8f5f1533-ca00-4377-853e-c5433faa591e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921544 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1c08ea1b-603c-4e53-89b6-bff65bd7154f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rd6sk\" (UID: \"1c08ea1b-603c-4e53-89b6-bff65bd7154f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921563 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnpq5\" (UniqueName: \"kubernetes.io/projected/8055779b-d4d8-48ce-bb04-f49073e28dc1-kube-api-access-jnpq5\") pod \"authentication-operator-69f744f599-nghng\" (UID: 
\"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921582 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-trusted-ca\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921598 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z98x8\" (UniqueName: \"kubernetes.io/projected/11045b9f-1d93-4f1d-852e-02354ef51979-kube-api-access-z98x8\") pod \"cluster-samples-operator-665b6dd947-p6vv2\" (UID: \"11045b9f-1d93-4f1d-852e-02354ef51979\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921614 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9th5\" (UniqueName: \"kubernetes.io/projected/227e09e3-5d47-4490-9135-0ecb29c18623-kube-api-access-k9th5\") pod \"machine-config-controller-84d6567774-dlk6z\" (UID: \"227e09e3-5d47-4490-9135-0ecb29c18623\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921682 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-dir\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.921958 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-images\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.922157 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.922375 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-audit-policies\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.922504 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-certificates\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.923243 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7-srv-cert\") pod \"catalog-operator-68c6474976-bps7p\" (UID: \"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.923348 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8055779b-d4d8-48ce-bb04-f49073e28dc1-config\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.923754 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.924206 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.924213 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a3137ff-020a-43c5-867a-9ab59df067ff-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.925199 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-encryption-config\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc 
kubenswrapper[4991]: I1006 08:21:38.925657 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.925925 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.926002 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aab780d-af84-45fa-bc9c-b728d4e196d1-config-volume\") pod \"collect-profiles-29328975-9xcfl\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.926343 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-oauth-config\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.926792 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-config\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.926808 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 06 08:21:38 crc kubenswrapper[4991]: E1006 08:21:38.927172 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:39.427156973 +0000 UTC m=+151.164906994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.928088 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-policies\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.928316 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/337bd770-81c5-466f-a32e-9fef462765c8-serving-cert\") pod \"openshift-config-operator-7777fb866f-gc8rv\" (UID: \"337bd770-81c5-466f-a32e-9fef462765c8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.928598 4991 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/337bd770-81c5-466f-a32e-9fef462765c8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gc8rv\" (UID: \"337bd770-81c5-466f-a32e-9fef462765c8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.929250 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-oauth-serving-cert\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.929331 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.929535 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb83cb02-67d8-4f38-aad6-001ea28de60a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.929797 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f5f1533-ca00-4377-853e-c5433faa591e-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kw55l\" (UID: \"8f5f1533-ca00-4377-853e-c5433faa591e\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.929850 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8055779b-d4d8-48ce-bb04-f49073e28dc1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.930178 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.930337 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f5f1533-ca00-4377-853e-c5433faa591e-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kw55l\" (UID: \"8f5f1533-ca00-4377-853e-c5433faa591e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.930576 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.930920 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-config\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.931197 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-client-ca\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.931623 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-service-ca\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.932097 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-trusted-ca-bundle\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.932189 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.933270 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7-profile-collector-cert\") pod \"catalog-operator-68c6474976-bps7p\" (UID: \"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.934222 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.935980 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-trusted-ca\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.935965 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-serving-cert\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.937263 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb83cb02-67d8-4f38-aad6-001ea28de60a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.937627 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aab780d-af84-45fa-bc9c-b728d4e196d1-secret-volume\") pod \"collect-profiles-29328975-9xcfl\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.939600 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/11045b9f-1d93-4f1d-852e-02354ef51979-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p6vv2\" (UID: \"11045b9f-1d93-4f1d-852e-02354ef51979\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.939702 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8055779b-d4d8-48ce-bb04-f49073e28dc1-serving-cert\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.940257 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.941934 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-etcd-client\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 
08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.941938 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-serving-cert\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.942266 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.942419 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.943055 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-tls\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.945755 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a3137ff-020a-43c5-867a-9ab59df067ff-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: 
\"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.945999 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15148cd6-6d64-4a92-a334-b5014bf8b05a-serving-cert\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.946695 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.947173 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ef6468-c4e0-4a26-820b-ddd444b50a07-serving-cert\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.947728 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.966074 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 06 08:21:38 crc kubenswrapper[4991]: I1006 08:21:38.986647 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 06 08:21:39 crc 
kubenswrapper[4991]: I1006 08:21:39.007262 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.022091 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.022336 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:39.522202688 +0000 UTC m=+151.259952719 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.022621 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d-signing-key\") pod \"service-ca-9c57cc56f-vxbrw\" (UID: \"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d\") " pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.022701 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/227e09e3-5d47-4490-9135-0ecb29c18623-proxy-tls\") pod \"machine-config-controller-84d6567774-dlk6z\" (UID: \"227e09e3-5d47-4490-9135-0ecb29c18623\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.022749 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s2dt\" (UniqueName: \"kubernetes.io/projected/1c08ea1b-603c-4e53-89b6-bff65bd7154f-kube-api-access-9s2dt\") pod \"olm-operator-6b444d44fb-rd6sk\" (UID: \"1c08ea1b-603c-4e53-89b6-bff65bd7154f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.022784 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1c08ea1b-603c-4e53-89b6-bff65bd7154f-srv-cert\") pod \"olm-operator-6b444d44fb-rd6sk\" (UID: \"1c08ea1b-603c-4e53-89b6-bff65bd7154f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.023008 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.023681 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d-signing-cabundle\") pod \"service-ca-9c57cc56f-vxbrw\" (UID: \"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d\") " pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 
08:21:39.023846 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:39.52373894 +0000 UTC m=+151.261488951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.023979 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsvg4\" (UniqueName: \"kubernetes.io/projected/14e2aab8-50a5-4db6-9efa-579949a454bb-kube-api-access-vsvg4\") pod \"dns-operator-744455d44c-8zvhl\" (UID: \"14e2aab8-50a5-4db6-9efa-579949a454bb\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.024043 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/227e09e3-5d47-4490-9135-0ecb29c18623-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dlk6z\" (UID: \"227e09e3-5d47-4490-9135-0ecb29c18623\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.024176 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mrs4\" (UniqueName: \"kubernetes.io/projected/0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d-kube-api-access-6mrs4\") pod \"service-ca-9c57cc56f-vxbrw\" (UID: \"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.024254 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1c08ea1b-603c-4e53-89b6-bff65bd7154f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rd6sk\" (UID: \"1c08ea1b-603c-4e53-89b6-bff65bd7154f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.024337 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9th5\" (UniqueName: \"kubernetes.io/projected/227e09e3-5d47-4490-9135-0ecb29c18623-kube-api-access-k9th5\") pod \"machine-config-controller-84d6567774-dlk6z\" (UID: \"227e09e3-5d47-4490-9135-0ecb29c18623\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.024389 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14e2aab8-50a5-4db6-9efa-579949a454bb-metrics-tls\") pod \"dns-operator-744455d44c-8zvhl\" (UID: \"14e2aab8-50a5-4db6-9efa-579949a454bb\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.024869 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcw9l\" (UniqueName: \"kubernetes.io/projected/c78d976c-800d-4739-bdd4-5b8e5943c0a5-kube-api-access-dcw9l\") pod \"migrator-59844c95c7-tgxlv\" (UID: \"c78d976c-800d-4739-bdd4-5b8e5943c0a5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.030580 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/1c08ea1b-603c-4e53-89b6-bff65bd7154f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rd6sk\" (UID: \"1c08ea1b-603c-4e53-89b6-bff65bd7154f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.032065 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/227e09e3-5d47-4490-9135-0ecb29c18623-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dlk6z\" (UID: \"227e09e3-5d47-4490-9135-0ecb29c18623\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.034716 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1c08ea1b-603c-4e53-89b6-bff65bd7154f-srv-cert\") pod \"olm-operator-6b444d44fb-rd6sk\" (UID: \"1c08ea1b-603c-4e53-89b6-bff65bd7154f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.035552 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.041836 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/14e2aab8-50a5-4db6-9efa-579949a454bb-metrics-tls\") pod \"dns-operator-744455d44c-8zvhl\" (UID: \"14e2aab8-50a5-4db6-9efa-579949a454bb\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.051729 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.065868 4991 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.085633 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.109760 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.127328 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.132886 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.133121 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:39.63309364 +0000 UTC m=+151.370843661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.133400 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.133825 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:39.63381062 +0000 UTC m=+151.371560651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.147195 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.166944 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.187513 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.206263 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.227028 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.235083 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.235236 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:39.735207059 +0000 UTC m=+151.472957100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.235349 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.235785 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:39.735773665 +0000 UTC m=+151.473523766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.245827 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.266556 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.286960 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.306651 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.327506 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.336988 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.337505 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:39.837456943 +0000 UTC m=+151.575207004 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.346606 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.378978 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.388428 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.434505 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d92zz\" (UniqueName: \"kubernetes.io/projected/d3ae516b-0866-40bc-b886-44111fef9329-kube-api-access-d92zz\") pod \"apiserver-76f77b778f-d2pr9\" (UID: \"d3ae516b-0866-40bc-b886-44111fef9329\") " pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.439160 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.440443 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:39.940416795 +0000 UTC m=+151.678166856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.447694 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.467465 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.487606 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.507725 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.527139 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.540616 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.540893 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.040832888 +0000 UTC m=+151.778582909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.541242 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.541913 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.041880618 +0000 UTC m=+151.779630679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.547664 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.568164 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.586818 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.607458 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.626572 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.642800 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.643076 4991 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.143043951 +0000 UTC m=+151.880794012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.643592 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.644175 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.144142831 +0000 UTC m=+151.881892892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.646582 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.654663 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.670878 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.689623 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.707987 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.727265 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.744421 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.744989 4991 request.go:700] Waited for 1.014282977s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.745205 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.245170961 +0000 UTC m=+151.982920992 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.747504 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.767076 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.793207 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.807872 4991 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.830043 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.841649 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/227e09e3-5d47-4490-9135-0ecb29c18623-proxy-tls\") pod \"machine-config-controller-84d6567774-dlk6z\" (UID: \"227e09e3-5d47-4490-9135-0ecb29c18623\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.848079 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.847285 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.848535 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.348273438 +0000 UTC m=+152.086023499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.866873 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.875579 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d-signing-cabundle\") pod \"service-ca-9c57cc56f-vxbrw\" (UID: \"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d\") " pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.887215 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.908777 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.919997 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d-signing-key\") pod \"service-ca-9c57cc56f-vxbrw\" (UID: \"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d\") " pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.926401 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-d2pr9"] Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.927497 4991 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.946837 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.949832 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.950050 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.450020047 +0000 UTC m=+152.187770078 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.950443 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:39 crc kubenswrapper[4991]: E1006 08:21:39.950916 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.450908613 +0000 UTC m=+152.188658634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.976017 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 06 08:21:39 crc kubenswrapper[4991]: I1006 08:21:39.986825 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.008536 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.027810 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.047118 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.051348 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.051560 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.551531231 +0000 UTC m=+152.289281252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.051687 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.052186 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.552171289 +0000 UTC m=+152.289921310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.067615 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.087756 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.106819 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.126806 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" event={"ID":"d3ae516b-0866-40bc-b886-44111fef9329","Type":"ContainerStarted","Data":"4bc85daad0068e263e04a88893bdd68d3d597a81ab80cf5937dca068177dfddc"} Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.127830 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.146222 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.152969 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.153188 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.653138027 +0000 UTC m=+152.390888048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.153503 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.153853 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.653840796 +0000 UTC m=+152.391590807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.166652 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.194727 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.206773 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.226752 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.247833 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.255229 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.255373 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.755346919 +0000 UTC m=+152.493096960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.255494 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.255946 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.755933205 +0000 UTC m=+152.493683226 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.267717 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.288220 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.306958 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.327816 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.347734 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.356474 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.356751 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.856721768 +0000 UTC m=+152.594471799 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.356832 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.357337 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.857321915 +0000 UTC m=+152.595071956 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.366446 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.386380 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.406530 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.427135 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.446707 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.458592 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.458798 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-10-06 08:21:40.958761406 +0000 UTC m=+152.696511467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.459211 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.459738 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:40.959716172 +0000 UTC m=+152.697466233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.467405 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.487464 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.507001 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.526993 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.546695 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.560794 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.560998 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.060956007 +0000 UTC m=+152.798706068 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.561594 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.561990 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.061977746 +0000 UTC m=+152.799727767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.566944 4991 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.587161 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.607709 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.663062 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.663324 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.163262353 +0000 UTC m=+152.901012444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.663768 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.664355 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.164281091 +0000 UTC m=+152.902031152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.685908 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ckkw\" (UniqueName: \"kubernetes.io/projected/8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7-kube-api-access-4ckkw\") pod \"catalog-operator-68c6474976-bps7p\" (UID: \"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.687520 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hk8j\" (UniqueName: \"kubernetes.io/projected/b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5-kube-api-access-6hk8j\") pod \"apiserver-7bbb656c7d-zlfb2\" (UID: \"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.706957 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-bound-sa-token\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.732206 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bzcr\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-kube-api-access-6bzcr\") pod \"image-registry-697d97f7c8-zl4k8\" 
(UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.742340 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp2d9\" (UniqueName: \"kubernetes.io/projected/337bd770-81c5-466f-a32e-9fef462765c8-kube-api-access-cp2d9\") pod \"openshift-config-operator-7777fb866f-gc8rv\" (UID: \"337bd770-81c5-466f-a32e-9fef462765c8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.764348 4991 request.go:700] Waited for 1.84186874s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.764739 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.764929 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.264890159 +0000 UTC m=+153.002640190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.765111 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.765499 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.265488815 +0000 UTC m=+153.003238836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.768798 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzprj\" (UniqueName: \"kubernetes.io/projected/9a3137ff-020a-43c5-867a-9ab59df067ff-kube-api-access-mzprj\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.783946 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jff\" (UniqueName: \"kubernetes.io/projected/8f5f1533-ca00-4377-853e-c5433faa591e-kube-api-access-t2jff\") pod \"openshift-apiserver-operator-796bbdcf4f-kw55l\" (UID: \"8f5f1533-ca00-4377-853e-c5433faa591e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.801632 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.814409 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl7r2\" (UniqueName: \"kubernetes.io/projected/15148cd6-6d64-4a92-a334-b5014bf8b05a-kube-api-access-gl7r2\") pod \"route-controller-manager-6576b87f9c-9wqrp\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.833834 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a3137ff-020a-43c5-867a-9ab59df067ff-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hwgrf\" (UID: \"9a3137ff-020a-43c5-867a-9ab59df067ff\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.857008 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q6gh\" (UniqueName: \"kubernetes.io/projected/8a8266da-ca7f-4357-8aa7-86aaa7fb23c6-kube-api-access-5q6gh\") pod \"machine-api-operator-5694c8668f-vr6sj\" (UID: \"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.865768 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-png2l\" (UniqueName: \"kubernetes.io/projected/1aab780d-af84-45fa-bc9c-b728d4e196d1-kube-api-access-png2l\") pod \"collect-profiles-29328975-9xcfl\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.866158 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.866448 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.366414832 +0000 UTC m=+153.104164893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.867330 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.868035 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.367972936 +0000 UTC m=+153.105722997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.885929 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.893825 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.895512 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-289sj\" (UniqueName: \"kubernetes.io/projected/f4ef6468-c4e0-4a26-820b-ddd444b50a07-kube-api-access-289sj\") pod \"controller-manager-879f6c89f-lcvbr\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.902975 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.915998 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4c8j\" (UniqueName: \"kubernetes.io/projected/c941e944-a837-41ff-90b0-29464fc3f02d-kube-api-access-c4c8j\") pod \"console-f9d7485db-kp5gc\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.926258 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.934411 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss4zd\" (UniqueName: \"kubernetes.io/projected/ac30cd53-f61e-4f56-8110-4eacc0aade3f-kube-api-access-ss4zd\") pod \"oauth-openshift-558db77b4-vtcb6\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.937798 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.957675 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.961012 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnpq5\" (UniqueName: \"kubernetes.io/projected/8055779b-d4d8-48ce-bb04-f49073e28dc1-kube-api-access-jnpq5\") pod \"authentication-operator-69f744f599-nghng\" (UID: \"8055779b-d4d8-48ce-bb04-f49073e28dc1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.968863 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.969035 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.469002245 +0000 UTC m=+153.206752266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.969178 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:40 crc kubenswrapper[4991]: E1006 08:21:40.969837 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.469823018 +0000 UTC m=+153.207573039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.970141 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z98x8\" (UniqueName: \"kubernetes.io/projected/11045b9f-1d93-4f1d-852e-02354ef51979-kube-api-access-z98x8\") pod \"cluster-samples-operator-665b6dd947-p6vv2\" (UID: \"11045b9f-1d93-4f1d-852e-02354ef51979\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.989188 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s2dt\" (UniqueName: \"kubernetes.io/projected/1c08ea1b-603c-4e53-89b6-bff65bd7154f-kube-api-access-9s2dt\") pod \"olm-operator-6b444d44fb-rd6sk\" (UID: \"1c08ea1b-603c-4e53-89b6-bff65bd7154f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:40 crc kubenswrapper[4991]: I1006 08:21:40.995981 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.013577 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsvg4\" (UniqueName: \"kubernetes.io/projected/14e2aab8-50a5-4db6-9efa-579949a454bb-kube-api-access-vsvg4\") pod \"dns-operator-744455d44c-8zvhl\" (UID: \"14e2aab8-50a5-4db6-9efa-579949a454bb\") " pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.026758 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.032042 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mrs4\" (UniqueName: \"kubernetes.io/projected/0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d-kube-api-access-6mrs4\") pod \"service-ca-9c57cc56f-vxbrw\" (UID: \"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d\") " pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.048548 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcw9l\" (UniqueName: \"kubernetes.io/projected/c78d976c-800d-4739-bdd4-5b8e5943c0a5-kube-api-access-dcw9l\") pod \"migrator-59844c95c7-tgxlv\" (UID: \"c78d976c-800d-4739-bdd4-5b8e5943c0a5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.051493 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.065999 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9th5\" (UniqueName: \"kubernetes.io/projected/227e09e3-5d47-4490-9135-0ecb29c18623-kube-api-access-k9th5\") pod \"machine-config-controller-84d6567774-dlk6z\" (UID: \"227e09e3-5d47-4490-9135-0ecb29c18623\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.072200 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:41 crc kubenswrapper[4991]: E1006 08:21:41.072510 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.572489253 +0000 UTC m=+153.310239264 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.079455 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.079952 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.081580 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.122927 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.132580 4991 generic.go:334] "Generic (PLEG): container finished" podID="d3ae516b-0866-40bc-b886-44111fef9329" containerID="9548eb8eb2d7c5fa4f6611150d95ff6e23ad568934a6d608e2b8d0f0fd737020" exitCode=0 Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.132667 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" event={"ID":"d3ae516b-0866-40bc-b886-44111fef9329","Type":"ContainerDied","Data":"9548eb8eb2d7c5fa4f6611150d95ff6e23ad568934a6d608e2b8d0f0fd737020"} Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.144344 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.146948 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.170107 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173173 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce698484-23c3-49bc-94f8-6a5fc2efccf5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6t2r\" (UID: \"ce698484-23c3-49bc-94f8-6a5fc2efccf5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173212 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20802e3a-7d2a-43a1-83cb-e50b6ed66d96-cert\") pod \"ingress-canary-w8qs2\" (UID: \"20802e3a-7d2a-43a1-83cb-e50b6ed66d96\") " pod="openshift-ingress-canary/ingress-canary-w8qs2" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173242 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-trusted-ca\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173271 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-plugins-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173340 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rnbs\" (UniqueName: 
\"kubernetes.io/projected/0a7333dc-b6d2-4513-8574-a95446be656b-kube-api-access-4rnbs\") pod \"marketplace-operator-79b997595-p5tk4\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173369 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8s7\" (UniqueName: \"kubernetes.io/projected/3642df91-ce0a-494a-a3a6-58c5ae92f69a-kube-api-access-9v8s7\") pod \"package-server-manager-789f6589d5-2d9tn\" (UID: \"3642df91-ce0a-494a-a3a6-58c5ae92f69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173428 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-config\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173515 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20560fe2-fd64-4aa1-9d9c-0f0046a10141-stats-auth\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173555 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p5tk4\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173572 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-serving-cert\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173589 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh6x9\" (UniqueName: \"kubernetes.io/projected/20560fe2-fd64-4aa1-9d9c-0f0046a10141-kube-api-access-jh6x9\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173608 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce698484-23c3-49bc-94f8-6a5fc2efccf5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6t2r\" (UID: \"ce698484-23c3-49bc-94f8-6a5fc2efccf5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173649 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4af307fe-9280-424c-8cba-6f63aead910b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173676 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/137e3b24-3531-43e3-8bdd-17dda6d83922-metrics-tls\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173715 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24b7d367-f4b1-4ba9-b1f0-77e706c85ac4-config-volume\") pod \"dns-default-9nxks\" (UID: \"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4\") " pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173735 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d3960fea-a405-4e1e-b9a8-14c574ad45e8-node-bootstrap-token\") pod \"machine-config-server-mwdsr\" (UID: \"d3960fea-a405-4e1e-b9a8-14c574ad45e8\") " pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173751 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20560fe2-fd64-4aa1-9d9c-0f0046a10141-default-certificate\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173793 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgxsg\" (UniqueName: \"kubernetes.io/projected/7f68050b-a4cf-4dd6-bfaa-6ea9674af578-kube-api-access-wgxsg\") pod \"service-ca-operator-777779d784-lzn9r\" (UID: \"7f68050b-a4cf-4dd6-bfaa-6ea9674af578\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173810 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02f0b661-1f67-4d17-ae49-2c0c703a50ee-config\") pod \"kube-apiserver-operator-766d6c64bb-6g7qx\" (UID: \"02f0b661-1f67-4d17-ae49-2c0c703a50ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173824 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-serving-cert\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.173871 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02f0b661-1f67-4d17-ae49-2c0c703a50ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6g7qx\" (UID: \"02f0b661-1f67-4d17-ae49-2c0c703a50ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176106 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65e5c1d-a9f2-4954-b72e-27f2d2895ac0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rjt\" (UID: \"f65e5c1d-a9f2-4954-b72e-27f2d2895ac0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176150 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7760a6d-1e1d-413f-9fbe-e89e4d621ff6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lv2fp\" (UID: \"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176178 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p5tk4\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176228 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-socket-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176252 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9877d0f7-2431-42fb-b4aa-5d612cdb417f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nbqqm\" (UID: \"9877d0f7-2431-42fb-b4aa-5d612cdb417f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176271 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1bd509b-00aa-4e06-88fd-6849dbeab980-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176373 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-etcd-ca\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176404 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e1bd509b-00aa-4e06-88fd-6849dbeab980-tmpfs\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176424 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbpcp\" (UniqueName: \"kubernetes.io/projected/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-kube-api-access-xbpcp\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176466 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24b7d367-f4b1-4ba9-b1f0-77e706c85ac4-metrics-tls\") pod \"dns-default-9nxks\" (UID: \"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4\") " pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.176497 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7f68050b-a4cf-4dd6-bfaa-6ea9674af578-config\") pod \"service-ca-operator-777779d784-lzn9r\" (UID: \"7f68050b-a4cf-4dd6-bfaa-6ea9674af578\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.177206 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9877d0f7-2431-42fb-b4aa-5d612cdb417f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nbqqm\" (UID: \"9877d0f7-2431-42fb-b4aa-5d612cdb417f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.177245 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc4qz\" (UniqueName: \"kubernetes.io/projected/4af307fe-9280-424c-8cba-6f63aead910b-kube-api-access-hc4qz\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.177270 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02f0b661-1f67-4d17-ae49-2c0c703a50ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6g7qx\" (UID: \"02f0b661-1f67-4d17-ae49-2c0c703a50ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.177412 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af307fe-9280-424c-8cba-6f63aead910b-proxy-tls\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: 
\"4af307fe-9280-424c-8cba-6f63aead910b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.177438 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhxt7\" (UniqueName: \"kubernetes.io/projected/1821f7df-bab1-4667-aba0-e5ccd51d187e-kube-api-access-rhxt7\") pod \"kube-storage-version-migrator-operator-b67b599dd-ptmj5\" (UID: \"1821f7df-bab1-4667-aba0-e5ccd51d187e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.177824 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5ps6\" (UniqueName: \"kubernetes.io/projected/137e3b24-3531-43e3-8bdd-17dda6d83922-kube-api-access-v5ps6\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.177853 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33d9fca8-faa3-43de-aee2-25ca05c03ab2-auth-proxy-config\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.177875 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9877d0f7-2431-42fb-b4aa-5d612cdb417f-config\") pod \"kube-controller-manager-operator-78b949d7b-nbqqm\" (UID: \"9877d0f7-2431-42fb-b4aa-5d612cdb417f\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.177949 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1821f7df-bab1-4667-aba0-e5ccd51d187e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ptmj5\" (UID: \"1821f7df-bab1-4667-aba0-e5ccd51d187e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.177973 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7760a6d-1e1d-413f-9fbe-e89e4d621ff6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lv2fp\" (UID: \"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.178281 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78dnw\" (UniqueName: \"kubernetes.io/projected/24b7d367-f4b1-4ba9-b1f0-77e706c85ac4-kube-api-access-78dnw\") pod \"dns-default-9nxks\" (UID: \"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4\") " pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.178380 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gpf5\" (UniqueName: \"kubernetes.io/projected/bde59828-827b-4873-b51d-34038c9ab9ca-kube-api-access-6gpf5\") pod \"downloads-7954f5f757-nxbxt\" (UID: \"bde59828-827b-4873-b51d-34038c9ab9ca\") " pod="openshift-console/downloads-7954f5f757-nxbxt" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.178411 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/33d9fca8-faa3-43de-aee2-25ca05c03ab2-machine-approver-tls\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.178455 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3642df91-ce0a-494a-a3a6-58c5ae92f69a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2d9tn\" (UID: \"3642df91-ce0a-494a-a3a6-58c5ae92f69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.178484 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/137e3b24-3531-43e3-8bdd-17dda6d83922-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.179529 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shm8r\" (UniqueName: \"kubernetes.io/projected/e1bd509b-00aa-4e06-88fd-6849dbeab980-kube-api-access-shm8r\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.179580 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/20560fe2-fd64-4aa1-9d9c-0f0046a10141-metrics-certs\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.180262 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/137e3b24-3531-43e3-8bdd-17dda6d83922-trusted-ca\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.185278 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx4pg\" (UniqueName: \"kubernetes.io/projected/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-kube-api-access-kx4pg\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.185448 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-registration-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.185490 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qktlr\" (UniqueName: \"kubernetes.io/projected/f65e5c1d-a9f2-4954-b72e-27f2d2895ac0-kube-api-access-qktlr\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rjt\" (UID: \"f65e5c1d-a9f2-4954-b72e-27f2d2895ac0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" Oct 
06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.185670 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1bd509b-00aa-4e06-88fd-6849dbeab980-webhook-cert\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.186734 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20560fe2-fd64-4aa1-9d9c-0f0046a10141-service-ca-bundle\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.186826 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f68050b-a4cf-4dd6-bfaa-6ea9674af578-serving-cert\") pod \"service-ca-operator-777779d784-lzn9r\" (UID: \"7f68050b-a4cf-4dd6-bfaa-6ea9674af578\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.186930 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-etcd-service-ca\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187055 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7760a6d-1e1d-413f-9fbe-e89e4d621ff6-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-lv2fp\" (UID: \"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187123 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-mountpoint-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187158 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d9fca8-faa3-43de-aee2-25ca05c03ab2-config\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187183 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/adf23d88-9ae5-46c1-bb4b-fa513ab5beb6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4pctg\" (UID: \"adf23d88-9ae5-46c1-bb4b-fa513ab5beb6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187225 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-config\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187263 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1821f7df-bab1-4667-aba0-e5ccd51d187e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ptmj5\" (UID: \"1821f7df-bab1-4667-aba0-e5ccd51d187e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187288 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d3960fea-a405-4e1e-b9a8-14c574ad45e8-certs\") pod \"machine-config-server-mwdsr\" (UID: \"d3960fea-a405-4e1e-b9a8-14c574ad45e8\") " pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187330 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsvnz\" (UniqueName: \"kubernetes.io/projected/6603119d-4075-47ce-9cc8-4d030353dffa-kube-api-access-qsvnz\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187355 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-etcd-client\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187392 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: 
\"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187415 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dck6\" (UniqueName: \"kubernetes.io/projected/20802e3a-7d2a-43a1-83cb-e50b6ed66d96-kube-api-access-9dck6\") pod \"ingress-canary-w8qs2\" (UID: \"20802e3a-7d2a-43a1-83cb-e50b6ed66d96\") " pod="openshift-ingress-canary/ingress-canary-w8qs2" Oct 06 08:21:41 crc kubenswrapper[4991]: E1006 08:21:41.187762 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.687742836 +0000 UTC m=+153.425492857 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.187996 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lns26\" (UniqueName: \"kubernetes.io/projected/33d9fca8-faa3-43de-aee2-25ca05c03ab2-kube-api-access-lns26\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.188120 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jql54\" 
(UniqueName: \"kubernetes.io/projected/d3960fea-a405-4e1e-b9a8-14c574ad45e8-kube-api-access-jql54\") pod \"machine-config-server-mwdsr\" (UID: \"d3960fea-a405-4e1e-b9a8-14c574ad45e8\") " pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.188198 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4af307fe-9280-424c-8cba-6f63aead910b-images\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.188238 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whf4s\" (UniqueName: \"kubernetes.io/projected/ce698484-23c3-49bc-94f8-6a5fc2efccf5-kube-api-access-whf4s\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6t2r\" (UID: \"ce698484-23c3-49bc-94f8-6a5fc2efccf5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.188344 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqln7\" (UniqueName: \"kubernetes.io/projected/adf23d88-9ae5-46c1-bb4b-fa513ab5beb6-kube-api-access-hqln7\") pod \"multus-admission-controller-857f4d67dd-4pctg\" (UID: \"adf23d88-9ae5-46c1-bb4b-fa513ab5beb6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.188425 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-csi-data-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: 
\"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.259013 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.270630 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.290681 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.290983 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-trusted-ca\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291032 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rnbs\" (UniqueName: \"kubernetes.io/projected/0a7333dc-b6d2-4513-8574-a95446be656b-kube-api-access-4rnbs\") pod \"marketplace-operator-79b997595-p5tk4\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291055 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8s7\" (UniqueName: 
\"kubernetes.io/projected/3642df91-ce0a-494a-a3a6-58c5ae92f69a-kube-api-access-9v8s7\") pod \"package-server-manager-789f6589d5-2d9tn\" (UID: \"3642df91-ce0a-494a-a3a6-58c5ae92f69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291075 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-plugins-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291092 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-config\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291132 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20560fe2-fd64-4aa1-9d9c-0f0046a10141-stats-auth\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291158 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p5tk4\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291172 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-serving-cert\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291196 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh6x9\" (UniqueName: \"kubernetes.io/projected/20560fe2-fd64-4aa1-9d9c-0f0046a10141-kube-api-access-jh6x9\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291214 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce698484-23c3-49bc-94f8-6a5fc2efccf5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6t2r\" (UID: \"ce698484-23c3-49bc-94f8-6a5fc2efccf5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291231 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4af307fe-9280-424c-8cba-6f63aead910b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291261 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/137e3b24-3531-43e3-8bdd-17dda6d83922-metrics-tls\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291282 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24b7d367-f4b1-4ba9-b1f0-77e706c85ac4-config-volume\") pod \"dns-default-9nxks\" (UID: \"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4\") " pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291321 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d3960fea-a405-4e1e-b9a8-14c574ad45e8-node-bootstrap-token\") pod \"machine-config-server-mwdsr\" (UID: \"d3960fea-a405-4e1e-b9a8-14c574ad45e8\") " pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291340 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20560fe2-fd64-4aa1-9d9c-0f0046a10141-default-certificate\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291359 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgxsg\" (UniqueName: \"kubernetes.io/projected/7f68050b-a4cf-4dd6-bfaa-6ea9674af578-kube-api-access-wgxsg\") pod \"service-ca-operator-777779d784-lzn9r\" (UID: \"7f68050b-a4cf-4dd6-bfaa-6ea9674af578\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291375 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-serving-cert\") pod \"etcd-operator-b45778765-hbwft\" (UID: 
\"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291394 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02f0b661-1f67-4d17-ae49-2c0c703a50ee-config\") pod \"kube-apiserver-operator-766d6c64bb-6g7qx\" (UID: \"02f0b661-1f67-4d17-ae49-2c0c703a50ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291410 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02f0b661-1f67-4d17-ae49-2c0c703a50ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6g7qx\" (UID: \"02f0b661-1f67-4d17-ae49-2c0c703a50ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291429 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65e5c1d-a9f2-4954-b72e-27f2d2895ac0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rjt\" (UID: \"f65e5c1d-a9f2-4954-b72e-27f2d2895ac0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291450 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7760a6d-1e1d-413f-9fbe-e89e4d621ff6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lv2fp\" (UID: \"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291467 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p5tk4\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291495 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-socket-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291510 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9877d0f7-2431-42fb-b4aa-5d612cdb417f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nbqqm\" (UID: \"9877d0f7-2431-42fb-b4aa-5d612cdb417f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291538 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1bd509b-00aa-4e06-88fd-6849dbeab980-apiservice-cert\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291559 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-etcd-ca\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 
08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291586 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e1bd509b-00aa-4e06-88fd-6849dbeab980-tmpfs\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291605 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbpcp\" (UniqueName: \"kubernetes.io/projected/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-kube-api-access-xbpcp\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291633 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24b7d367-f4b1-4ba9-b1f0-77e706c85ac4-metrics-tls\") pod \"dns-default-9nxks\" (UID: \"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4\") " pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291648 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f68050b-a4cf-4dd6-bfaa-6ea9674af578-config\") pod \"service-ca-operator-777779d784-lzn9r\" (UID: \"7f68050b-a4cf-4dd6-bfaa-6ea9674af578\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291669 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc4qz\" (UniqueName: \"kubernetes.io/projected/4af307fe-9280-424c-8cba-6f63aead910b-kube-api-access-hc4qz\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291685 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02f0b661-1f67-4d17-ae49-2c0c703a50ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6g7qx\" (UID: \"02f0b661-1f67-4d17-ae49-2c0c703a50ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291701 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9877d0f7-2431-42fb-b4aa-5d612cdb417f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nbqqm\" (UID: \"9877d0f7-2431-42fb-b4aa-5d612cdb417f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291718 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxt7\" (UniqueName: \"kubernetes.io/projected/1821f7df-bab1-4667-aba0-e5ccd51d187e-kube-api-access-rhxt7\") pod \"kube-storage-version-migrator-operator-b67b599dd-ptmj5\" (UID: \"1821f7df-bab1-4667-aba0-e5ccd51d187e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291747 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5ps6\" (UniqueName: \"kubernetes.io/projected/137e3b24-3531-43e3-8bdd-17dda6d83922-kube-api-access-v5ps6\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291765 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af307fe-9280-424c-8cba-6f63aead910b-proxy-tls\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291790 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33d9fca8-faa3-43de-aee2-25ca05c03ab2-auth-proxy-config\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291805 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9877d0f7-2431-42fb-b4aa-5d612cdb417f-config\") pod \"kube-controller-manager-operator-78b949d7b-nbqqm\" (UID: \"9877d0f7-2431-42fb-b4aa-5d612cdb417f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291823 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1821f7df-bab1-4667-aba0-e5ccd51d187e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ptmj5\" (UID: \"1821f7df-bab1-4667-aba0-e5ccd51d187e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291841 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7760a6d-1e1d-413f-9fbe-e89e4d621ff6-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-lv2fp\" (UID: \"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291859 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78dnw\" (UniqueName: \"kubernetes.io/projected/24b7d367-f4b1-4ba9-b1f0-77e706c85ac4-kube-api-access-78dnw\") pod \"dns-default-9nxks\" (UID: \"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4\") " pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291883 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gpf5\" (UniqueName: \"kubernetes.io/projected/bde59828-827b-4873-b51d-34038c9ab9ca-kube-api-access-6gpf5\") pod \"downloads-7954f5f757-nxbxt\" (UID: \"bde59828-827b-4873-b51d-34038c9ab9ca\") " pod="openshift-console/downloads-7954f5f757-nxbxt" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291901 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/33d9fca8-faa3-43de-aee2-25ca05c03ab2-machine-approver-tls\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291919 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3642df91-ce0a-494a-a3a6-58c5ae92f69a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2d9tn\" (UID: \"3642df91-ce0a-494a-a3a6-58c5ae92f69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291948 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/137e3b24-3531-43e3-8bdd-17dda6d83922-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291980 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20560fe2-fd64-4aa1-9d9c-0f0046a10141-metrics-certs\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.291998 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shm8r\" (UniqueName: \"kubernetes.io/projected/e1bd509b-00aa-4e06-88fd-6849dbeab980-kube-api-access-shm8r\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292016 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/137e3b24-3531-43e3-8bdd-17dda6d83922-trusted-ca\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292038 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx4pg\" (UniqueName: \"kubernetes.io/projected/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-kube-api-access-kx4pg\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 
08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292054 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-registration-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292070 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qktlr\" (UniqueName: \"kubernetes.io/projected/f65e5c1d-a9f2-4954-b72e-27f2d2895ac0-kube-api-access-qktlr\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rjt\" (UID: \"f65e5c1d-a9f2-4954-b72e-27f2d2895ac0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292087 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1bd509b-00aa-4e06-88fd-6849dbeab980-webhook-cert\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292113 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f68050b-a4cf-4dd6-bfaa-6ea9674af578-serving-cert\") pod \"service-ca-operator-777779d784-lzn9r\" (UID: \"7f68050b-a4cf-4dd6-bfaa-6ea9674af578\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292144 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20560fe2-fd64-4aa1-9d9c-0f0046a10141-service-ca-bundle\") pod \"router-default-5444994796-g2mjl\" (UID: 
\"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292165 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-etcd-service-ca\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292182 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7760a6d-1e1d-413f-9fbe-e89e4d621ff6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lv2fp\" (UID: \"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292200 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d9fca8-faa3-43de-aee2-25ca05c03ab2-config\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292219 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/adf23d88-9ae5-46c1-bb4b-fa513ab5beb6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4pctg\" (UID: \"adf23d88-9ae5-46c1-bb4b-fa513ab5beb6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292235 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-config\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292252 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1821f7df-bab1-4667-aba0-e5ccd51d187e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ptmj5\" (UID: \"1821f7df-bab1-4667-aba0-e5ccd51d187e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292267 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-mountpoint-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292313 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d3960fea-a405-4e1e-b9a8-14c574ad45e8-certs\") pod \"machine-config-server-mwdsr\" (UID: \"d3960fea-a405-4e1e-b9a8-14c574ad45e8\") " pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292330 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsvnz\" (UniqueName: \"kubernetes.io/projected/6603119d-4075-47ce-9cc8-4d030353dffa-kube-api-access-qsvnz\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292353 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-etcd-client\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292381 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lns26\" (UniqueName: \"kubernetes.io/projected/33d9fca8-faa3-43de-aee2-25ca05c03ab2-kube-api-access-lns26\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292398 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jql54\" (UniqueName: \"kubernetes.io/projected/d3960fea-a405-4e1e-b9a8-14c574ad45e8-kube-api-access-jql54\") pod \"machine-config-server-mwdsr\" (UID: \"d3960fea-a405-4e1e-b9a8-14c574ad45e8\") " pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292418 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dck6\" (UniqueName: \"kubernetes.io/projected/20802e3a-7d2a-43a1-83cb-e50b6ed66d96-kube-api-access-9dck6\") pod \"ingress-canary-w8qs2\" (UID: \"20802e3a-7d2a-43a1-83cb-e50b6ed66d96\") " pod="openshift-ingress-canary/ingress-canary-w8qs2" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292438 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4af307fe-9280-424c-8cba-6f63aead910b-images\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: 
I1006 08:21:41.292455 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whf4s\" (UniqueName: \"kubernetes.io/projected/ce698484-23c3-49bc-94f8-6a5fc2efccf5-kube-api-access-whf4s\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6t2r\" (UID: \"ce698484-23c3-49bc-94f8-6a5fc2efccf5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292512 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-csi-data-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292529 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqln7\" (UniqueName: \"kubernetes.io/projected/adf23d88-9ae5-46c1-bb4b-fa513ab5beb6-kube-api-access-hqln7\") pod \"multus-admission-controller-857f4d67dd-4pctg\" (UID: \"adf23d88-9ae5-46c1-bb4b-fa513ab5beb6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292549 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce698484-23c3-49bc-94f8-6a5fc2efccf5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6t2r\" (UID: \"ce698484-23c3-49bc-94f8-6a5fc2efccf5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.292574 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20802e3a-7d2a-43a1-83cb-e50b6ed66d96-cert\") pod \"ingress-canary-w8qs2\" (UID: 
\"20802e3a-7d2a-43a1-83cb-e50b6ed66d96\") " pod="openshift-ingress-canary/ingress-canary-w8qs2" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.303567 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20802e3a-7d2a-43a1-83cb-e50b6ed66d96-cert\") pod \"ingress-canary-w8qs2\" (UID: \"20802e3a-7d2a-43a1-83cb-e50b6ed66d96\") " pod="openshift-ingress-canary/ingress-canary-w8qs2" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.306592 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9877d0f7-2431-42fb-b4aa-5d612cdb417f-config\") pod \"kube-controller-manager-operator-78b949d7b-nbqqm\" (UID: \"9877d0f7-2431-42fb-b4aa-5d612cdb417f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.308087 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7760a6d-1e1d-413f-9fbe-e89e4d621ff6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lv2fp\" (UID: \"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.309004 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02f0b661-1f67-4d17-ae49-2c0c703a50ee-config\") pod \"kube-apiserver-operator-766d6c64bb-6g7qx\" (UID: \"02f0b661-1f67-4d17-ae49-2c0c703a50ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.309766 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33d9fca8-faa3-43de-aee2-25ca05c03ab2-auth-proxy-config\") pod 
\"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: E1006 08:21:41.317458 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.817434056 +0000 UTC m=+153.555184067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.318740 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-socket-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.319127 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-config\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.319252 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/e1bd509b-00aa-4e06-88fd-6849dbeab980-tmpfs\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.319356 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-plugins-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.320149 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-trusted-ca\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.320919 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-etcd-ca\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.321023 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f68050b-a4cf-4dd6-bfaa-6ea9674af578-config\") pod \"service-ca-operator-777779d784-lzn9r\" (UID: \"7f68050b-a4cf-4dd6-bfaa-6ea9674af578\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.321680 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1821f7df-bab1-4667-aba0-e5ccd51d187e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ptmj5\" (UID: \"1821f7df-bab1-4667-aba0-e5ccd51d187e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.322070 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-etcd-service-ca\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.323109 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/137e3b24-3531-43e3-8bdd-17dda6d83922-trusted-ca\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.323776 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-csi-data-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.325206 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce698484-23c3-49bc-94f8-6a5fc2efccf5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6t2r\" (UID: \"ce698484-23c3-49bc-94f8-6a5fc2efccf5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.325537 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p5tk4\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.326102 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4af307fe-9280-424c-8cba-6f63aead910b-images\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.326266 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1821f7df-bab1-4667-aba0-e5ccd51d187e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ptmj5\" (UID: \"1821f7df-bab1-4667-aba0-e5ccd51d187e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.329655 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24b7d367-f4b1-4ba9-b1f0-77e706c85ac4-config-volume\") pod \"dns-default-9nxks\" (UID: \"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4\") " pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.330563 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4af307fe-9280-424c-8cba-6f63aead910b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.330912 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-config\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.331084 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-registration-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.331726 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20560fe2-fd64-4aa1-9d9c-0f0046a10141-service-ca-bundle\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.331906 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6603119d-4075-47ce-9cc8-4d030353dffa-mountpoint-dir\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.332429 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33d9fca8-faa3-43de-aee2-25ca05c03ab2-config\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.339762 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20560fe2-fd64-4aa1-9d9c-0f0046a10141-stats-auth\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.339934 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02f0b661-1f67-4d17-ae49-2c0c703a50ee-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6g7qx\" (UID: \"02f0b661-1f67-4d17-ae49-2c0c703a50ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.340061 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.349358 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78dnw\" (UniqueName: \"kubernetes.io/projected/24b7d367-f4b1-4ba9-b1f0-77e706c85ac4-kube-api-access-78dnw\") pod \"dns-default-9nxks\" (UID: \"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4\") " pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.349389 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1bd509b-00aa-4e06-88fd-6849dbeab980-webhook-cert\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.349471 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-serving-cert\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.357953 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/33d9fca8-faa3-43de-aee2-25ca05c03ab2-machine-approver-tls\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.357981 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f68050b-a4cf-4dd6-bfaa-6ea9674af578-serving-cert\") pod \"service-ca-operator-777779d784-lzn9r\" (UID: \"7f68050b-a4cf-4dd6-bfaa-6ea9674af578\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.358104 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d3960fea-a405-4e1e-b9a8-14c574ad45e8-certs\") pod \"machine-config-server-mwdsr\" (UID: \"d3960fea-a405-4e1e-b9a8-14c574ad45e8\") " pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.358273 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24b7d367-f4b1-4ba9-b1f0-77e706c85ac4-metrics-tls\") pod \"dns-default-9nxks\" (UID: \"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4\") " pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.358495 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65e5c1d-a9f2-4954-b72e-27f2d2895ac0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rjt\" (UID: \"f65e5c1d-a9f2-4954-b72e-27f2d2895ac0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.358678 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-etcd-client\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.358805 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1bd509b-00aa-4e06-88fd-6849dbeab980-apiservice-cert\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.359056 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d3960fea-a405-4e1e-b9a8-14c574ad45e8-node-bootstrap-token\") pod \"machine-config-server-mwdsr\" (UID: \"d3960fea-a405-4e1e-b9a8-14c574ad45e8\") " pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.359141 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20560fe2-fd64-4aa1-9d9c-0f0046a10141-default-certificate\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: 
I1006 08:21:41.360148 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p5tk4\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.360253 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/137e3b24-3531-43e3-8bdd-17dda6d83922-metrics-tls\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.360476 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/adf23d88-9ae5-46c1-bb4b-fa513ab5beb6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4pctg\" (UID: \"adf23d88-9ae5-46c1-bb4b-fa513ab5beb6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.361874 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-serving-cert\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.364169 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7760a6d-1e1d-413f-9fbe-e89e4d621ff6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lv2fp\" (UID: \"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.364845 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9877d0f7-2431-42fb-b4aa-5d612cdb417f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nbqqm\" (UID: \"9877d0f7-2431-42fb-b4aa-5d612cdb417f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.366040 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af307fe-9280-424c-8cba-6f63aead910b-proxy-tls\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.367915 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce698484-23c3-49bc-94f8-6a5fc2efccf5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6t2r\" (UID: \"ce698484-23c3-49bc-94f8-6a5fc2efccf5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.368359 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20560fe2-fd64-4aa1-9d9c-0f0046a10141-metrics-certs\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.368607 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3642df91-ce0a-494a-a3a6-58c5ae92f69a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2d9tn\" (UID: \"3642df91-ce0a-494a-a3a6-58c5ae92f69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.382210 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gpf5\" (UniqueName: \"kubernetes.io/projected/bde59828-827b-4873-b51d-34038c9ab9ca-kube-api-access-6gpf5\") pod \"downloads-7954f5f757-nxbxt\" (UID: \"bde59828-827b-4873-b51d-34038c9ab9ca\") " pod="openshift-console/downloads-7954f5f757-nxbxt" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.386378 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbpcp\" (UniqueName: \"kubernetes.io/projected/3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1-kube-api-access-xbpcp\") pod \"etcd-operator-b45778765-hbwft\" (UID: \"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.405960 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rnbs\" (UniqueName: \"kubernetes.io/projected/0a7333dc-b6d2-4513-8574-a95446be656b-kube-api-access-4rnbs\") pod \"marketplace-operator-79b997595-p5tk4\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.422409 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:41 crc 
kubenswrapper[4991]: E1006 08:21:41.423056 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:41.923038953 +0000 UTC m=+153.660788974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.425339 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8s7\" (UniqueName: \"kubernetes.io/projected/3642df91-ce0a-494a-a3a6-58c5ae92f69a-kube-api-access-9v8s7\") pod \"package-server-manager-789f6589d5-2d9tn\" (UID: \"3642df91-ce0a-494a-a3a6-58c5ae92f69a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.460548 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7760a6d-1e1d-413f-9fbe-e89e4d621ff6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lv2fp\" (UID: \"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.489423 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jql54\" (UniqueName: \"kubernetes.io/projected/d3960fea-a405-4e1e-b9a8-14c574ad45e8-kube-api-access-jql54\") pod \"machine-config-server-mwdsr\" (UID: 
\"d3960fea-a405-4e1e-b9a8-14c574ad45e8\") " pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.491570 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9877d0f7-2431-42fb-b4aa-5d612cdb417f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nbqqm\" (UID: \"9877d0f7-2431-42fb-b4aa-5d612cdb417f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.493538 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.510063 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.513358 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mwdsr" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.517875 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whf4s\" (UniqueName: \"kubernetes.io/projected/ce698484-23c3-49bc-94f8-6a5fc2efccf5-kube-api-access-whf4s\") pod \"openshift-controller-manager-operator-756b6f6bc6-t6t2r\" (UID: \"ce698484-23c3-49bc-94f8-6a5fc2efccf5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.539101 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.539856 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:41 crc kubenswrapper[4991]: E1006 08:21:41.540365 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.040350632 +0000 UTC m=+153.778100653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.542328 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.550785 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dck6\" (UniqueName: \"kubernetes.io/projected/20802e3a-7d2a-43a1-83cb-e50b6ed66d96-kube-api-access-9dck6\") pod \"ingress-canary-w8qs2\" (UID: \"20802e3a-7d2a-43a1-83cb-e50b6ed66d96\") " pod="openshift-ingress-canary/ingress-canary-w8qs2" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.550906 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nxbxt" Oct 06 08:21:41 crc kubenswrapper[4991]: W1006 08:21:41.561956 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a3137ff_020a_43c5_867a_9ab59df067ff.slice/crio-3f1ae814714ec7b64b773cc45b8145399ba5f5074b1581516011994e52fa342c WatchSource:0}: Error finding container 3f1ae814714ec7b64b773cc45b8145399ba5f5074b1581516011994e52fa342c: Status 404 returned error can't find the container with id 3f1ae814714ec7b64b773cc45b8145399ba5f5074b1581516011994e52fa342c Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.564196 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.566441 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/137e3b24-3531-43e3-8bdd-17dda6d83922-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.567282 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5ps6\" (UniqueName: 
\"kubernetes.io/projected/137e3b24-3531-43e3-8bdd-17dda6d83922-kube-api-access-v5ps6\") pod \"ingress-operator-5b745b69d9-hzl7x\" (UID: \"137e3b24-3531-43e3-8bdd-17dda6d83922\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.581143 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqln7\" (UniqueName: \"kubernetes.io/projected/adf23d88-9ae5-46c1-bb4b-fa513ab5beb6-kube-api-access-hqln7\") pod \"multus-admission-controller-857f4d67dd-4pctg\" (UID: \"adf23d88-9ae5-46c1-bb4b-fa513ab5beb6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.588156 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.590717 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.610576 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.615244 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgxsg\" (UniqueName: \"kubernetes.io/projected/7f68050b-a4cf-4dd6-bfaa-6ea9674af578-kube-api-access-wgxsg\") pod \"service-ca-operator-777779d784-lzn9r\" (UID: \"7f68050b-a4cf-4dd6-bfaa-6ea9674af578\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.641404 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:41 crc kubenswrapper[4991]: E1006 08:21:41.641881 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.141864975 +0000 UTC m=+153.879614996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.645676 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx4pg\" (UniqueName: \"kubernetes.io/projected/2eff5942-a2f5-4b2e-8a8a-d1555ffe952d-kube-api-access-kx4pg\") pod \"console-operator-58897d9998-l2z7h\" (UID: \"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d\") " pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.651575 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qktlr\" (UniqueName: \"kubernetes.io/projected/f65e5c1d-a9f2-4954-b72e-27f2d2895ac0-kube-api-access-qktlr\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rjt\" (UID: \"f65e5c1d-a9f2-4954-b72e-27f2d2895ac0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.671934 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh6x9\" (UniqueName: \"kubernetes.io/projected/20560fe2-fd64-4aa1-9d9c-0f0046a10141-kube-api-access-jh6x9\") pod \"router-default-5444994796-g2mjl\" (UID: \"20560fe2-fd64-4aa1-9d9c-0f0046a10141\") " pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.688867 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsvnz\" (UniqueName: 
\"kubernetes.io/projected/6603119d-4075-47ce-9cc8-4d030353dffa-kube-api-access-qsvnz\") pod \"csi-hostpathplugin-pgp24\" (UID: \"6603119d-4075-47ce-9cc8-4d030353dffa\") " pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.692939 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.698462 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.703008 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02f0b661-1f67-4d17-ae49-2c0c703a50ee-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6g7qx\" (UID: \"02f0b661-1f67-4d17-ae49-2c0c703a50ee\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.710145 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.724337 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vr6sj"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.733849 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc4qz\" (UniqueName: \"kubernetes.io/projected/4af307fe-9280-424c-8cba-6f63aead910b-kube-api-access-hc4qz\") pod \"machine-config-operator-74547568cd-lct7m\" (UID: \"4af307fe-9280-424c-8cba-6f63aead910b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.745023 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:41 crc kubenswrapper[4991]: E1006 08:21:41.745366 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.245338633 +0000 UTC m=+153.983088654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.745405 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:41 crc kubenswrapper[4991]: E1006 08:21:41.746226 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.246217707 +0000 UTC m=+153.983967728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.749384 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shm8r\" (UniqueName: \"kubernetes.io/projected/e1bd509b-00aa-4e06-88fd-6849dbeab980-kube-api-access-shm8r\") pod \"packageserver-d55dfcdfc-lq6ks\" (UID: \"e1bd509b-00aa-4e06-88fd-6849dbeab980\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.757066 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.762074 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.765589 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.771688 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhxt7\" (UniqueName: \"kubernetes.io/projected/1821f7df-bab1-4667-aba0-e5ccd51d187e-kube-api-access-rhxt7\") pod \"kube-storage-version-migrator-operator-b67b599dd-ptmj5\" (UID: \"1821f7df-bab1-4667-aba0-e5ccd51d187e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" Oct 06 08:21:41 crc kubenswrapper[4991]: W1006 08:21:41.776820 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a8266da_ca7f_4357_8aa7_86aaa7fb23c6.slice/crio-0019ee2bbb7d41f5a56065c428270e813d828565fdf8c29158229c1423053c65 WatchSource:0}: Error finding container 0019ee2bbb7d41f5a56065c428270e813d828565fdf8c29158229c1423053c65: Status 404 returned error can't find the container with id 0019ee2bbb7d41f5a56065c428270e813d828565fdf8c29158229c1423053c65 Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.777176 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.782463 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lns26\" (UniqueName: \"kubernetes.io/projected/33d9fca8-faa3-43de-aee2-25ca05c03ab2-kube-api-access-lns26\") pod \"machine-approver-56656f9798-6zk95\" (UID: \"33d9fca8-faa3-43de-aee2-25ca05c03ab2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.784584 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.794083 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vtcb6"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.797702 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.801479 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w8qs2" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.811043 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.817076 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.826666 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.838916 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pgp24" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.846212 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:41 crc kubenswrapper[4991]: E1006 08:21:41.852472 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.35243881 +0000 UTC m=+154.090188831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.877117 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.901900 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.903927 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vxbrw"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.919138 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.936609 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.947174 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.954042 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:41 crc kubenswrapper[4991]: E1006 08:21:41.954504 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.454486828 +0000 UTC m=+154.192236849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.971501 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nghng"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.973265 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z"] Oct 06 08:21:41 crc kubenswrapper[4991]: I1006 08:21:41.976899 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lcvbr"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.035036 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.041437 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8zvhl"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.054847 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kp5gc"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.055962 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:42 crc kubenswrapper[4991]: E1006 08:21:42.056545 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.556527205 +0000 UTC m=+154.294277226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.069049 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nxbxt"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.070068 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp"] Oct 06 08:21:42 crc kubenswrapper[4991]: W1006 08:21:42.126066 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4ef6468_c4e0_4a26_820b_ddd444b50a07.slice/crio-39b9f7887a1e5bc1b35e009576c3819df509ca66efa1ddd790caff9103c6f59c WatchSource:0}: Error finding container 39b9f7887a1e5bc1b35e009576c3819df509ca66efa1ddd790caff9103c6f59c: Status 404 returned error can't find the container with id 39b9f7887a1e5bc1b35e009576c3819df509ca66efa1ddd790caff9103c6f59c Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.152359 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" 
event={"ID":"f4ef6468-c4e0-4a26-820b-ddd444b50a07","Type":"ContainerStarted","Data":"39b9f7887a1e5bc1b35e009576c3819df509ca66efa1ddd790caff9103c6f59c"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.155404 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" event={"ID":"1aab780d-af84-45fa-bc9c-b728d4e196d1","Type":"ContainerStarted","Data":"ec5697ed95fa543f6e1d7395b7ab21c28a99c64c2fee9d4f14b1e4db062368d8"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.155437 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" event={"ID":"1aab780d-af84-45fa-bc9c-b728d4e196d1","Type":"ContainerStarted","Data":"ba705eef35c210d8cb9a999a89cdd488581be5e59d74f52c114624d017daae95"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.162590 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" event={"ID":"227e09e3-5d47-4490-9135-0ecb29c18623","Type":"ContainerStarted","Data":"b3876f64f821db8cc0fdd44ba17afb5d0c389263e290ec826c179425d0fb92a8"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.172156 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv" event={"ID":"c78d976c-800d-4739-bdd4-5b8e5943c0a5","Type":"ContainerStarted","Data":"46d84c08f4afac024f8ea00005ac730e5326e845f26a930531150910bf6a935c"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.173594 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 
08:21:42 crc kubenswrapper[4991]: E1006 08:21:42.176173 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.676153389 +0000 UTC m=+154.413903410 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:42 crc kubenswrapper[4991]: W1006 08:21:42.177616 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde59828_827b_4873_b51d_34038c9ab9ca.slice/crio-9337dba7596a3a1191aed3c0043bf609ba67738598712da87e4179b917529298 WatchSource:0}: Error finding container 9337dba7596a3a1191aed3c0043bf609ba67738598712da87e4179b917529298: Status 404 returned error can't find the container with id 9337dba7596a3a1191aed3c0043bf609ba67738598712da87e4179b917529298 Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.180597 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" event={"ID":"9a3137ff-020a-43c5-867a-9ab59df067ff","Type":"ContainerStarted","Data":"4fe5637cf9254debf17f4a314f5cac25e174dc69bddd7e702a87e543ae9ce817"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.180651 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" 
event={"ID":"9a3137ff-020a-43c5-867a-9ab59df067ff","Type":"ContainerStarted","Data":"3f1ae814714ec7b64b773cc45b8145399ba5f5074b1581516011994e52fa342c"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.229792 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" event={"ID":"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6","Type":"ContainerStarted","Data":"0019ee2bbb7d41f5a56065c428270e813d828565fdf8c29158229c1423053c65"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.231432 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hwgrf" podStartSLOduration=128.231411805 podStartE2EDuration="2m8.231411805s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:42.229233305 +0000 UTC m=+153.966983326" watchObservedRunningTime="2025-10-06 08:21:42.231411805 +0000 UTC m=+153.969161826" Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.237840 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9nxks"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.239868 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.242082 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" event={"ID":"ac30cd53-f61e-4f56-8110-4eacc0aade3f","Type":"ContainerStarted","Data":"08a9a50b6787abdabc4bdff51c2e277727550a5a8ee202a13a9096d828c78b89"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.247027 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" 
event={"ID":"d3ae516b-0866-40bc-b886-44111fef9329","Type":"ContainerStarted","Data":"3b748053d62629a41635fe3c4ac81820fff7e536427a32ff5bf31adcdacc2490"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.255333 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" event={"ID":"8055779b-d4d8-48ce-bb04-f49073e28dc1","Type":"ContainerStarted","Data":"e89a1713d6808cb2a6875248d788095bbb1a2bae374948fd4d218c7f0276229b"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.275034 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:42 crc kubenswrapper[4991]: E1006 08:21:42.275455 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.77541331 +0000 UTC m=+154.513163331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.275565 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.278275 4991 generic.go:334] "Generic (PLEG): container finished" podID="337bd770-81c5-466f-a32e-9fef462765c8" containerID="4654ef0c3872b6a0cefa3f9d4c1582bc1b140aad3575de51564ef54f6871831c" exitCode=0 Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.279377 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" event={"ID":"337bd770-81c5-466f-a32e-9fef462765c8","Type":"ContainerDied","Data":"4654ef0c3872b6a0cefa3f9d4c1582bc1b140aad3575de51564ef54f6871831c"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.279419 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" event={"ID":"337bd770-81c5-466f-a32e-9fef462765c8","Type":"ContainerStarted","Data":"77fab0a8f3188816fefbab56253ff0a73175fc4c3297ad971c76d56a739c0316"} Oct 06 08:21:42 crc kubenswrapper[4991]: E1006 08:21:42.278579 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.778560507 +0000 UTC m=+154.516310528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.324764 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" event={"ID":"15148cd6-6d64-4a92-a334-b5014bf8b05a","Type":"ContainerStarted","Data":"1389990a6108879aa898d1496125e481db9802b3cda22e58d802ec547dd3b43c"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.327844 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p5tk4"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.333799 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" event={"ID":"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7","Type":"ContainerStarted","Data":"8e282b6f55710a63135d72ff0eb0d0d0b30bd558b48dada9ab15c445152f617f"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.344117 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" event={"ID":"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d","Type":"ContainerStarted","Data":"5a7ceda9ba89c5c8397afbbba85a44e504e1f733292d31fe265731e31303db17"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.378741 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:42 crc kubenswrapper[4991]: E1006 08:21:42.379354 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.879336649 +0000 UTC m=+154.617086670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.381628 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hbwft"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.384219 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.388012 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.388235 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-l2z7h"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.391278 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" event={"ID":"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5","Type":"ContainerStarted","Data":"0821942cd53b4cbe0b659484abe9ca4b22e609f6249a4613e7a52e718104cea1"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.393505 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mwdsr" event={"ID":"d3960fea-a405-4e1e-b9a8-14c574ad45e8","Type":"ContainerStarted","Data":"71509a3bfc4e46e0ee5ad09869368a1eebd770087af0565d8d73af7dd2e43ce6"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.394334 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" event={"ID":"8f5f1533-ca00-4377-853e-c5433faa591e","Type":"ContainerStarted","Data":"8a5264ffc130bdb597f129aaa22df75a5dbd2312479564ea106bd73df223eb90"} Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.484171 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:42 crc kubenswrapper[4991]: E1006 08:21:42.484492 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:42.984480353 +0000 UTC m=+154.722230374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:42 crc kubenswrapper[4991]: W1006 08:21:42.559581 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eff5942_a2f5_4b2e_8a8a_d1555ffe952d.slice/crio-0c4b60a52bfed57e7b18f5641a156f6d28128f0d56726385e619d5b9baaf2c25 WatchSource:0}: Error finding container 0c4b60a52bfed57e7b18f5641a156f6d28128f0d56726385e619d5b9baaf2c25: Status 404 returned error can't find the container with id 0c4b60a52bfed57e7b18f5641a156f6d28128f0d56726385e619d5b9baaf2c25 Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.585278 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:42 crc kubenswrapper[4991]: E1006 08:21:42.593137 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:43.093086982 +0000 UTC m=+154.830837013 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.606435 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4pctg"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.686506 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:42 crc kubenswrapper[4991]: E1006 08:21:42.687242 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:43.187216601 +0000 UTC m=+154.924966622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.786685 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.797006 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:42 crc kubenswrapper[4991]: E1006 08:21:42.797481 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:43.297464286 +0000 UTC m=+155.035214307 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.834335 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.835898 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.840326 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.844685 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.846348 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m"] Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.848096 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" podStartSLOduration=129.848075943 podStartE2EDuration="2m9.848075943s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:42.841649896 +0000 UTC m=+154.579399917" 
watchObservedRunningTime="2025-10-06 08:21:42.848075943 +0000 UTC m=+154.585825964" Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.898990 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:42 crc kubenswrapper[4991]: E1006 08:21:42.899407 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:43.39938584 +0000 UTC m=+155.137135861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.942600 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks"] Oct 06 08:21:42 crc kubenswrapper[4991]: W1006 08:21:42.952852 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce698484_23c3_49bc_94f8_6a5fc2efccf5.slice/crio-377181858c138654fe5d57b723f8b3fc63ec37b8536cc799ca4298dd05365efc WatchSource:0}: Error finding container 377181858c138654fe5d57b723f8b3fc63ec37b8536cc799ca4298dd05365efc: Status 404 returned error can't find the 
container with id 377181858c138654fe5d57b723f8b3fc63ec37b8536cc799ca4298dd05365efc Oct 06 08:21:42 crc kubenswrapper[4991]: W1006 08:21:42.954155 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1821f7df_bab1_4667_aba0_e5ccd51d187e.slice/crio-20f75210f8806641957466156fd752be7439fbf5413ad76fff1aee0caa0294f0 WatchSource:0}: Error finding container 20f75210f8806641957466156fd752be7439fbf5413ad76fff1aee0caa0294f0: Status 404 returned error can't find the container with id 20f75210f8806641957466156fd752be7439fbf5413ad76fff1aee0caa0294f0 Oct 06 08:21:42 crc kubenswrapper[4991]: I1006 08:21:42.978981 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pgp24"] Oct 06 08:21:42 crc kubenswrapper[4991]: W1006 08:21:42.986994 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f68050b_a4cf_4dd6_bfaa_6ea9674af578.slice/crio-3f5fe21de083121a3f0280388d90ea4205afe21cb73cbb218cd8baf8154f6911 WatchSource:0}: Error finding container 3f5fe21de083121a3f0280388d90ea4205afe21cb73cbb218cd8baf8154f6911: Status 404 returned error can't find the container with id 3f5fe21de083121a3f0280388d90ea4205afe21cb73cbb218cd8baf8154f6911 Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:42.999736 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.000667 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-06 08:21:43.500640596 +0000 UTC m=+155.238390617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.101933 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w8qs2"] Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.109263 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.109686 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:43.609671986 +0000 UTC m=+155.347422007 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.213869 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.214163 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:43.714143921 +0000 UTC m=+155.451893942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: W1006 08:21:43.227468 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20802e3a_7d2a_43a1_83cb_e50b6ed66d96.slice/crio-b0de29713b90bacfb0675108796b86161382637d807754f007843c3751c55055 WatchSource:0}: Error finding container b0de29713b90bacfb0675108796b86161382637d807754f007843c3751c55055: Status 404 returned error can't find the container with id b0de29713b90bacfb0675108796b86161382637d807754f007843c3751c55055 Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.315996 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.316573 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:43.816556859 +0000 UTC m=+155.554306880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.403390 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" event={"ID":"8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7","Type":"ContainerStarted","Data":"80775c701c784c8a42aa648bbb97ee12bd99726ae8abd6102f2d29151fb7acd1"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.404236 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.405052 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" event={"ID":"9877d0f7-2431-42fb-b4aa-5d612cdb417f","Type":"ContainerStarted","Data":"254e5bcfea6ce60980581b24aaeb37fe62cb53251dfbceaf4a76be7e9ea6978f"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.407830 4991 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bps7p container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.407890 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" podUID="8f9ccdf3-5ec0-428c-a9a9-da1d6967e6d7" 
containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.411389 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" event={"ID":"0a7333dc-b6d2-4513-8574-a95446be656b","Type":"ContainerStarted","Data":"30f04d0eac527ef4689ecdc50654f18fb1b7ed928b5a15280082a80724283473"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.418946 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.419115 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:43.919086131 +0000 UTC m=+155.656836152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.419274 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.419763 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:43.919745979 +0000 UTC m=+155.657496000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.421467 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" event={"ID":"14e2aab8-50a5-4db6-9efa-579949a454bb","Type":"ContainerStarted","Data":"3ef866a5391ee177f2b9c6519fd8e65666ee38d908d7a59dec208d0895351092"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.421519 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" event={"ID":"14e2aab8-50a5-4db6-9efa-579949a454bb","Type":"ContainerStarted","Data":"a8047fb4df9631fba3adf576aa95f3803bbdf2347b88821723918b3dd256a777"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.424002 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" event={"ID":"7f68050b-a4cf-4dd6-bfaa-6ea9674af578","Type":"ContainerStarted","Data":"3f5fe21de083121a3f0280388d90ea4205afe21cb73cbb218cd8baf8154f6911"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.427141 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" event={"ID":"8055779b-d4d8-48ce-bb04-f49073e28dc1","Type":"ContainerStarted","Data":"945c85c79920888361ecda40a6f8ac9f7d69fc2fd70f46d918ed220498aa8c3c"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.429785 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" event={"ID":"02f0b661-1f67-4d17-ae49-2c0c703a50ee","Type":"ContainerStarted","Data":"b0f3df400ba10347c9dd17bce1d854eb751a8246b7d74c3c7205513081169d00"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.432801 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" event={"ID":"3642df91-ce0a-494a-a3a6-58c5ae92f69a","Type":"ContainerStarted","Data":"04e6814a534255f0816ccf78cca507ae8f030aef05ac998d392f338c1bb13905"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.435238 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pgp24" event={"ID":"6603119d-4075-47ce-9cc8-4d030353dffa","Type":"ContainerStarted","Data":"74513f74f1856602d813214dd5e8c7864c18eb3986d9cfb55b9dae53ace0a0b9"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.437645 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" event={"ID":"137e3b24-3531-43e3-8bdd-17dda6d83922","Type":"ContainerStarted","Data":"445db75676fa79a491ab739a8abda2c92da8739d3239a67364e8e0901d755cfb"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.444487 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" event={"ID":"ac30cd53-f61e-4f56-8110-4eacc0aade3f","Type":"ContainerStarted","Data":"c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.444993 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.446594 4991 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-vtcb6 container/oauth-openshift namespace/openshift-authentication: Readiness probe 
status=failure output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" start-of-body= Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.446634 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" podUID="ac30cd53-f61e-4f56-8110-4eacc0aade3f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.451236 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" event={"ID":"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6","Type":"ContainerStarted","Data":"3e3cdb78df0e25eb0d15102f0f1caf849b44226f04b6980172fcddeccd2f4afb"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.451306 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" event={"ID":"f7760a6d-1e1d-413f-9fbe-e89e4d621ff6","Type":"ContainerStarted","Data":"6f4c4f986a66b21998b7c6cb16bf565b281717af9299af36460fb2f9724e4c6c"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.456720 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" event={"ID":"1c08ea1b-603c-4e53-89b6-bff65bd7154f","Type":"ContainerStarted","Data":"f9a078d673cca7884f350e8cfdf5a1e344b4410b959c0796395a746176c70244"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.456766 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" event={"ID":"1c08ea1b-603c-4e53-89b6-bff65bd7154f","Type":"ContainerStarted","Data":"5d82f4cda02d619d1ab8a49056688b289ab33a648ae8c1859e925fbeee2d2a91"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.457821 4991 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.464060 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" event={"ID":"227e09e3-5d47-4490-9135-0ecb29c18623","Type":"ContainerStarted","Data":"a27120458c4dd8fde21cbc6bc5087829e8678d3095625173689d7ed0cb1941a7"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.471239 4991 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rd6sk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.471344 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" podUID="1c08ea1b-603c-4e53-89b6-bff65bd7154f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.474775 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" event={"ID":"f4ef6468-c4e0-4a26-820b-ddd444b50a07","Type":"ContainerStarted","Data":"374db1bb08a8fdd28dbe241bb6c2e81d39360e0b1bb0ba55f97d680a4cba2beb"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.475249 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.484747 4991 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-lcvbr container/controller-manager namespace/openshift-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.484811 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" podUID="f4ef6468-c4e0-4a26-820b-ddd444b50a07" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.501249 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" event={"ID":"d3ae516b-0866-40bc-b886-44111fef9329","Type":"ContainerStarted","Data":"f85549c54b83706d5b8a0eed930fc4fbf66d45ecb29cfa0ca47a512f9666fb55"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.527127 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" event={"ID":"ce698484-23c3-49bc-94f8-6a5fc2efccf5","Type":"ContainerStarted","Data":"377181858c138654fe5d57b723f8b3fc63ec37b8536cc799ca4298dd05365efc"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.528329 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.528418 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 08:21:44.028399699 +0000 UTC m=+155.766149720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.528608 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.530120 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.030103956 +0000 UTC m=+155.767853977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.553962 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-l2z7h" event={"ID":"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d","Type":"ContainerStarted","Data":"0c4b60a52bfed57e7b18f5641a156f6d28128f0d56726385e619d5b9baaf2c25"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.561227 4991 generic.go:334] "Generic (PLEG): container finished" podID="b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5" containerID="08cbc06b3956fdd0be666d22e49e6da674740b3546b10d172157fb47489b7e1d" exitCode=0 Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.563533 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" event={"ID":"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5","Type":"ContainerDied","Data":"08cbc06b3956fdd0be666d22e49e6da674740b3546b10d172157fb47489b7e1d"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.584240 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" event={"ID":"0a1ec52a-7fbe-4c4c-aa0a-abe8776e6c7d","Type":"ContainerStarted","Data":"0606d023c20139ee4765f796ae793328035f5f1e391a6e4d38e7fc0092ece13c"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.596144 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" podStartSLOduration=130.596120219 podStartE2EDuration="2m10.596120219s" 
podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:43.580192149 +0000 UTC m=+155.317942170" watchObservedRunningTime="2025-10-06 08:21:43.596120219 +0000 UTC m=+155.333870240" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.610431 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" podStartSLOduration=129.610410734 podStartE2EDuration="2m9.610410734s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:43.553244386 +0000 UTC m=+155.290994407" watchObservedRunningTime="2025-10-06 08:21:43.610410734 +0000 UTC m=+155.348160755" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.616750 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mwdsr" event={"ID":"d3960fea-a405-4e1e-b9a8-14c574ad45e8","Type":"ContainerStarted","Data":"07ed2537c75957e71af9ac5699e227a49eb66a94a6b131558a071d354d1bf6a6"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.627898 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kp5gc" event={"ID":"c941e944-a837-41ff-90b0-29464fc3f02d","Type":"ContainerStarted","Data":"26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.628149 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kp5gc" event={"ID":"c941e944-a837-41ff-90b0-29464fc3f02d","Type":"ContainerStarted","Data":"c7b4747145a0e8e248704bc7f148d93e5d5ffec6d93f46bdde0f648f083c1051"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.629412 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.632276 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.132256227 +0000 UTC m=+155.870006248 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.645360 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" event={"ID":"1821f7df-bab1-4667-aba0-e5ccd51d187e","Type":"ContainerStarted","Data":"20f75210f8806641957466156fd752be7439fbf5413ad76fff1aee0caa0294f0"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.650605 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" event={"ID":"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1","Type":"ContainerStarted","Data":"3ffdb1d9bd54772e3a89523e01ae03163defafeec42d15a6c01c3c8fc6b99750"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.651737 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" event={"ID":"e1bd509b-00aa-4e06-88fd-6849dbeab980","Type":"ContainerStarted","Data":"0407b57d71574a73f2eceb0cd64a8385e6fd29271999cfe82dd7660afa90fcb0"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.660148 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" event={"ID":"8f5f1533-ca00-4377-853e-c5433faa591e","Type":"ContainerStarted","Data":"0156ca79eff97c77b7d3e23dfdcf3e4f177bcc6bb87ebe027661099c40fb0675"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.663868 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" event={"ID":"f65e5c1d-a9f2-4954-b72e-27f2d2895ac0","Type":"ContainerStarted","Data":"8a876d99ee5599d9b85cf9ac6b0ce4d2e959e3ade295536641bd4a805694ed23"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.678995 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-nghng" podStartSLOduration=130.678969386 podStartE2EDuration="2m10.678969386s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:43.646017147 +0000 UTC m=+155.383767168" watchObservedRunningTime="2025-10-06 08:21:43.678969386 +0000 UTC m=+155.416719397" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.704193 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nxbxt" event={"ID":"bde59828-827b-4873-b51d-34038c9ab9ca","Type":"ContainerStarted","Data":"9337dba7596a3a1191aed3c0043bf609ba67738598712da87e4179b917529298"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.722005 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lv2fp" podStartSLOduration=129.721823991 podStartE2EDuration="2m9.721823991s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:43.687842842 +0000 UTC m=+155.425592863" watchObservedRunningTime="2025-10-06 08:21:43.721823991 +0000 UTC m=+155.459574012" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.723646 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" podStartSLOduration=129.723609859 podStartE2EDuration="2m9.723609859s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:43.719530097 +0000 UTC m=+155.457280118" watchObservedRunningTime="2025-10-06 08:21:43.723609859 +0000 UTC m=+155.461359880" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.730897 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.731362 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.231348033 +0000 UTC m=+155.969098054 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.735642 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9nxks" event={"ID":"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4","Type":"ContainerStarted","Data":"6d9a3a1d864e77caeee293652e6c137722c07d6281fc24d3c9c75c0845447ff4"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.739063 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" event={"ID":"11045b9f-1d93-4f1d-852e-02354ef51979","Type":"ContainerStarted","Data":"d61a57859628323a145fa6ea928bee5c78dcc8fca2121dbccdfbc29dc9875b59"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.739131 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" event={"ID":"11045b9f-1d93-4f1d-852e-02354ef51979","Type":"ContainerStarted","Data":"82204e100215c2518a92f187948177f93c251445ffc183217123063f3ee0b86b"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.756069 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" event={"ID":"adf23d88-9ae5-46c1-bb4b-fa513ab5beb6","Type":"ContainerStarted","Data":"b75abf1db031699f7d8a24434be325c65e7592df0ce44b6b141422c1cc533388"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.789168 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" event={"ID":"33d9fca8-faa3-43de-aee2-25ca05c03ab2","Type":"ContainerStarted","Data":"7b0d91f4f8b85396a8b62074f4d12c2a184b176c9a2d99e93a563471a0b48c2f"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.807728 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" podStartSLOduration=129.807709841 podStartE2EDuration="2m9.807709841s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:43.764625292 +0000 UTC m=+155.502375313" watchObservedRunningTime="2025-10-06 08:21:43.807709841 +0000 UTC m=+155.545459862" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.820846 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-g2mjl" event={"ID":"20560fe2-fd64-4aa1-9d9c-0f0046a10141","Type":"ContainerStarted","Data":"9c1c17306ffa0ec03a16108403ce27e948bc206882d72e5d0bd408c958dd305f"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.821773 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" podStartSLOduration=130.821730759 podStartE2EDuration="2m10.821730759s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:43.805954804 +0000 UTC m=+155.543704815" watchObservedRunningTime="2025-10-06 08:21:43.821730759 +0000 UTC m=+155.559480780" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.825346 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" 
event={"ID":"4af307fe-9280-424c-8cba-6f63aead910b","Type":"ContainerStarted","Data":"a91f62787f360ea8f0093a16d7512196485eff31d6d8403f7f83844eee26c6bc"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.831903 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.831981 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.331962601 +0000 UTC m=+156.069712622 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.838676 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" event={"ID":"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6","Type":"ContainerStarted","Data":"9d059a390efce9306ace6c0884aa87177d68702e43b9919e0c04a90cc4bc1891"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.839140 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w8qs2" 
event={"ID":"20802e3a-7d2a-43a1-83cb-e50b6ed66d96","Type":"ContainerStarted","Data":"b0de29713b90bacfb0675108796b86161382637d807754f007843c3751c55055"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.839361 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.841374 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.341356991 +0000 UTC m=+156.079107012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.843545 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mwdsr" podStartSLOduration=5.8435321810000005 podStartE2EDuration="5.843532181s" podCreationTimestamp="2025-10-06 08:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:43.84170752 +0000 UTC m=+155.579457541" watchObservedRunningTime="2025-10-06 08:21:43.843532181 +0000 UTC 
m=+155.581282202" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.877756 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" event={"ID":"337bd770-81c5-466f-a32e-9fef462765c8","Type":"ContainerStarted","Data":"ff48484ea1296ed17db6c9e8e21f2c2787850331b84351fe36a055a3d9e9111a"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.877892 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.897931 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" event={"ID":"15148cd6-6d64-4a92-a334-b5014bf8b05a","Type":"ContainerStarted","Data":"054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6"} Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.898828 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.927930 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kw55l" podStartSLOduration=130.927912851 podStartE2EDuration="2m10.927912851s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:43.92571283 +0000 UTC m=+155.663462871" watchObservedRunningTime="2025-10-06 08:21:43.927912851 +0000 UTC m=+155.665662862" Oct 06 08:21:43 crc kubenswrapper[4991]: I1006 08:21:43.941397 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:43 crc kubenswrapper[4991]: E1006 08:21:43.942680 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.442662309 +0000 UTC m=+156.180412330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.006572 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vxbrw" podStartSLOduration=130.006549592 podStartE2EDuration="2m10.006549592s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:43.961001625 +0000 UTC m=+155.698751636" watchObservedRunningTime="2025-10-06 08:21:44.006549592 +0000 UTC m=+155.744299613" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.008582 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-kp5gc" podStartSLOduration=131.008569478 podStartE2EDuration="2m11.008569478s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:44.006680866 +0000 UTC m=+155.744430887" watchObservedRunningTime="2025-10-06 08:21:44.008569478 +0000 UTC m=+155.746319499" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.042548 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 08:21:44.043042 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.543026089 +0000 UTC m=+156.280776110 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.047924 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" podStartSLOduration=130.047903364 podStartE2EDuration="2m10.047903364s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:44.046787444 +0000 UTC m=+155.784537465" watchObservedRunningTime="2025-10-06 08:21:44.047903364 +0000 UTC m=+155.785653385" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.093269 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-g2mjl" podStartSLOduration=130.093251636 podStartE2EDuration="2m10.093251636s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:44.092262459 +0000 UTC m=+155.830012480" watchObservedRunningTime="2025-10-06 08:21:44.093251636 +0000 UTC m=+155.831001657" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.134560 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" podStartSLOduration=131.133886929 podStartE2EDuration="2m11.133886929s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:44.130832035 +0000 UTC m=+155.868582056" watchObservedRunningTime="2025-10-06 08:21:44.133886929 +0000 UTC m=+155.871636950" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.143358 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 08:21:44.143693 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.643676609 +0000 UTC m=+156.381426630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.247632 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 08:21:44.248002 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.74798872 +0000 UTC m=+156.485738741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.291720 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.349009 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 08:21:44.349418 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.84940078 +0000 UTC m=+156.587150801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.451482 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 08:21:44.454522 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:44.952704693 +0000 UTC m=+156.690454854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.554282 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 08:21:44.554874 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.054850903 +0000 UTC m=+156.792600924 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.655759 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.655979 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 08:21:44.656768 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.156751846 +0000 UTC m=+156.894501867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.669296 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.717888 4991 patch_prober.go:28] interesting pod/apiserver-76f77b778f-d2pr9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 06 08:21:44 crc kubenswrapper[4991]: [+]log ok Oct 06 08:21:44 crc kubenswrapper[4991]: [+]etcd ok Oct 06 08:21:44 crc kubenswrapper[4991]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 06 08:21:44 crc kubenswrapper[4991]: [+]poststarthook/generic-apiserver-start-informers ok Oct 06 08:21:44 crc kubenswrapper[4991]: [+]poststarthook/max-in-flight-filter ok Oct 06 08:21:44 crc kubenswrapper[4991]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 06 08:21:44 crc kubenswrapper[4991]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 06 08:21:44 crc kubenswrapper[4991]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 06 08:21:44 crc kubenswrapper[4991]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 06 08:21:44 crc kubenswrapper[4991]: [+]poststarthook/project.openshift.io-projectcache ok Oct 06 08:21:44 crc kubenswrapper[4991]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 06 08:21:44 crc 
kubenswrapper[4991]: [+]poststarthook/openshift.io-startinformers ok Oct 06 08:21:44 crc kubenswrapper[4991]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 06 08:21:44 crc kubenswrapper[4991]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 06 08:21:44 crc kubenswrapper[4991]: livez check failed Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.718630 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" podUID="d3ae516b-0866-40bc-b886-44111fef9329" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.759754 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.760181 4991 patch_prober.go:28] interesting pod/router-default-5444994796-g2mjl container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.760249 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g2mjl" podUID="20560fe2-fd64-4aa1-9d9c-0f0046a10141" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.768005 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 08:21:44.768978 4991 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.268950834 +0000 UTC m=+157.006700865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.769540 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 08:21:44.769927 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.269916632 +0000 UTC m=+157.007666653 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.871848 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 08:21:44.872685 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.372664189 +0000 UTC m=+157.110414210 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.918720 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-g2mjl" event={"ID":"20560fe2-fd64-4aa1-9d9c-0f0046a10141","Type":"ContainerStarted","Data":"ad1e91254e184483beb021b1a114b5b6c7d0afdcfe221571e31669c43193d08e"} Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.928581 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" event={"ID":"ce698484-23c3-49bc-94f8-6a5fc2efccf5","Type":"ContainerStarted","Data":"086e9f127c775d4d10af4ff3546d33060a177e9124fe84a01e054093f7771f30"} Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.933874 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" event={"ID":"227e09e3-5d47-4490-9135-0ecb29c18623","Type":"ContainerStarted","Data":"89d282d1cfb372f26b7257bb061029aa7b072ddfbe4dd731ddcb241386b53b63"} Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.974663 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:44 crc kubenswrapper[4991]: E1006 
08:21:44.975681 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.475662162 +0000 UTC m=+157.213412183 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.984771 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" event={"ID":"8a8266da-ca7f-4357-8aa7-86aaa7fb23c6","Type":"ContainerStarted","Data":"46720104ac03a36f7d74a093e8574935a9e4e3be381edb73fcab784e723f4998"} Oct 06 08:21:44 crc kubenswrapper[4991]: I1006 08:21:44.996516 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-t6t2r" podStartSLOduration=130.996492898 podStartE2EDuration="2m10.996492898s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:44.961511502 +0000 UTC m=+156.699261523" watchObservedRunningTime="2025-10-06 08:21:44.996492898 +0000 UTC m=+156.734242919" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.018469 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dlk6z" podStartSLOduration=131.018449135 
podStartE2EDuration="2m11.018449135s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:44.997026933 +0000 UTC m=+156.734776954" watchObservedRunningTime="2025-10-06 08:21:45.018449135 +0000 UTC m=+156.756199156" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.020768 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vr6sj" podStartSLOduration=131.020758178 podStartE2EDuration="2m11.020758178s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.018882156 +0000 UTC m=+156.756632167" watchObservedRunningTime="2025-10-06 08:21:45.020758178 +0000 UTC m=+156.758508199" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.021749 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w8qs2" event={"ID":"20802e3a-7d2a-43a1-83cb-e50b6ed66d96","Type":"ContainerStarted","Data":"def689bf38a0f450a9cce0374a35791324580a0c5e39571d1856b9226416ac95"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.028636 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" event={"ID":"137e3b24-3531-43e3-8bdd-17dda6d83922","Type":"ContainerStarted","Data":"d86622f3ed61807588a5f30d6181e2480e33e951079b73e782c943b2f042c1f7"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.028708 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" event={"ID":"137e3b24-3531-43e3-8bdd-17dda6d83922","Type":"ContainerStarted","Data":"c4da267ec9e2d5355dbf300aba3326c54ddce6d4236dfae67a03879f3cf68854"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 
08:21:45.040390 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.055514 4991 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-lq6ks container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" start-of-body= Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.055598 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" podUID="e1bd509b-00aa-4e06-88fd-6849dbeab980" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.056356 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" event={"ID":"0a7333dc-b6d2-4513-8574-a95446be656b","Type":"ContainerStarted","Data":"658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.056901 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.061485 4991 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p5tk4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.061558 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" 
podUID="0a7333dc-b6d2-4513-8574-a95446be656b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.069047 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" event={"ID":"adf23d88-9ae5-46c1-bb4b-fa513ab5beb6","Type":"ContainerStarted","Data":"b66971cc20093ab1c23dafa6741f99402eac1f8a5de8aef6513bb85315d4a411"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.075822 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.077335 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.57731884 +0000 UTC m=+157.315068861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.082552 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-w8qs2" podStartSLOduration=7.082535223 podStartE2EDuration="7.082535223s" podCreationTimestamp="2025-10-06 08:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.048798512 +0000 UTC m=+156.786548533" watchObservedRunningTime="2025-10-06 08:21:45.082535223 +0000 UTC m=+156.820285244" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.083103 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" podStartSLOduration=131.08309673 podStartE2EDuration="2m11.08309673s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.073665089 +0000 UTC m=+156.811415110" watchObservedRunningTime="2025-10-06 08:21:45.08309673 +0000 UTC m=+156.820846751" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.086792 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" event={"ID":"3642df91-ce0a-494a-a3a6-58c5ae92f69a","Type":"ContainerStarted","Data":"ceb31b930b43403c07c4bf71221623c74700b27f319dc544b07588f62fb61e8f"} Oct 06 08:21:45 crc 
kubenswrapper[4991]: I1006 08:21:45.086927 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" event={"ID":"3642df91-ce0a-494a-a3a6-58c5ae92f69a","Type":"ContainerStarted","Data":"0ced242fbbd8fe70b6ec2124e90683e5253f850fea8b093cdb2f01ab13ab0f78"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.087862 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.097220 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hzl7x" podStartSLOduration=131.097196909 podStartE2EDuration="2m11.097196909s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.096443527 +0000 UTC m=+156.834193548" watchObservedRunningTime="2025-10-06 08:21:45.097196909 +0000 UTC m=+156.834946930" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.103585 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" event={"ID":"33d9fca8-faa3-43de-aee2-25ca05c03ab2","Type":"ContainerStarted","Data":"5eb80eb23bd0f29cd78dd73072af15c2013d887bace165eecb1d9d7fc8e85a53"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.120086 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" event={"ID":"7f68050b-a4cf-4dd6-bfaa-6ea9674af578","Type":"ContainerStarted","Data":"a9044a3b4e86ed9ab53c08413054cd21895787a9100cfa920a9e64dfa335cdbb"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.125194 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" event={"ID":"4af307fe-9280-424c-8cba-6f63aead910b","Type":"ContainerStarted","Data":"464f1e4e419a3a4b057a8c2fec7503a799426171e8b13cfe06ccaa2ada254c64"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.132112 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" event={"ID":"f65e5c1d-a9f2-4954-b72e-27f2d2895ac0","Type":"ContainerStarted","Data":"2354a134e8108bea476527739aa8eba3bf39689e8a807d1522ad55ef64b06e58"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.137827 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" podStartSLOduration=131.13780537 podStartE2EDuration="2m11.13780537s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.132190335 +0000 UTC m=+156.869940356" watchObservedRunningTime="2025-10-06 08:21:45.13780537 +0000 UTC m=+156.875555391" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.142366 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" event={"ID":"1821f7df-bab1-4667-aba0-e5ccd51d187e","Type":"ContainerStarted","Data":"da077ffdb5e1dfdf86b269545cf269548b1a6fe69ea652053a4fc2752e014a90"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.162035 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" podStartSLOduration=131.162007288 podStartE2EDuration="2m11.162007288s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-06 08:21:45.153573126 +0000 UTC m=+156.891323147" watchObservedRunningTime="2025-10-06 08:21:45.162007288 +0000 UTC m=+156.899757309" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.177844 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.181818 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.681800115 +0000 UTC m=+157.419550136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.182627 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" event={"ID":"3bd5b603-a2f4-4935-9b39-3f3b42c9c0e1","Type":"ContainerStarted","Data":"23861a49b28ea60faeb6ac012696ec2e18bce0ee00803870c680dd8b4237da8d"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.190147 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-l2z7h" 
event={"ID":"2eff5942-a2f5-4b2e-8a8a-d1555ffe952d","Type":"ContainerStarted","Data":"be394b7eecc06bf18d82dd901b22846f5d238a7a9c104ac9133452fe1809d57a"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.191411 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.199865 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" event={"ID":"9877d0f7-2431-42fb-b4aa-5d612cdb417f","Type":"ContainerStarted","Data":"3586837d5592625ffc07fee0219f2804fb99397cf57adb9640ccca6c9f2c15c7"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.210372 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nxbxt" event={"ID":"bde59828-827b-4873-b51d-34038c9ab9ca","Type":"ContainerStarted","Data":"ca43804ba8c115e8e13475cb31c7b4820a5285e8609b34bebce0b9e5df19cef8"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.211449 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nxbxt" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.217249 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ptmj5" podStartSLOduration=131.217220153 podStartE2EDuration="2m11.217220153s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.210866567 +0000 UTC m=+156.948616588" watchObservedRunningTime="2025-10-06 08:21:45.217220153 +0000 UTC m=+156.954970174" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.232162 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-lzn9r" podStartSLOduration=131.232139465 podStartE2EDuration="2m11.232139465s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.182269668 +0000 UTC m=+156.920019679" watchObservedRunningTime="2025-10-06 08:21:45.232139465 +0000 UTC m=+156.969889486" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.233659 4991 patch_prober.go:28] interesting pod/console-operator-58897d9998-l2z7h container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/readyz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.240365 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-l2z7h" podUID="2eff5942-a2f5-4b2e-8a8a-d1555ffe952d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/readyz\": dial tcp 10.217.0.40:8443: connect: connection refused" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.238413 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-nxbxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.241814 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nxbxt" podUID="bde59828-827b-4873-b51d-34038c9ab9ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.258498 4991 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" podStartSLOduration=131.258465612 podStartE2EDuration="2m11.258465612s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.249065732 +0000 UTC m=+156.986815763" watchObservedRunningTime="2025-10-06 08:21:45.258465612 +0000 UTC m=+156.996215633" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.278939 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.279821 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.779785301 +0000 UTC m=+157.517535322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.281198 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.282955 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.782942318 +0000 UTC m=+157.520692339 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.330483 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9nxks" event={"ID":"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4","Type":"ContainerStarted","Data":"a149d0fcc8fef9e8963492212f04c92a1fb4030be8a2ff24d01371d7a04b4f4d"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.338696 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.338829 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.338909 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" event={"ID":"11045b9f-1d93-4f1d-852e-02354ef51979","Type":"ContainerStarted","Data":"f73305349a4bf7e827d5d7b652175f1165aae814e5faddea20f07abc3b1e0f89"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.338987 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv" event={"ID":"c78d976c-800d-4739-bdd4-5b8e5943c0a5","Type":"ContainerStarted","Data":"728787f4cabb946fc007da4782cbc7baa5af27367cfaf4e6087539983a8bd63e"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.339054 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv" event={"ID":"c78d976c-800d-4739-bdd4-5b8e5943c0a5","Type":"ContainerStarted","Data":"773532dab1c76d313a50ef4c37b75f6edfd2566d00142b5192f9118e11a2bc55"} Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.342465 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bps7p" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.362003 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rd6sk" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.371872 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" podStartSLOduration=131.371835383 podStartE2EDuration="2m11.371835383s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.361699502 +0000 UTC m=+157.099449523" watchObservedRunningTime="2025-10-06 08:21:45.371835383 +0000 UTC m=+157.109585404" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.372225 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rjt" podStartSLOduration=131.372219353 podStartE2EDuration="2m11.372219353s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.276518891 +0000 UTC m=+157.014268932" watchObservedRunningTime="2025-10-06 08:21:45.372219353 +0000 UTC m=+157.109969374" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.385733 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.387391 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.887337 +0000 UTC m=+157.625087021 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.442144 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nbqqm" podStartSLOduration=131.442126294 podStartE2EDuration="2m11.442126294s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.405607725 +0000 UTC m=+157.143357766" watchObservedRunningTime="2025-10-06 08:21:45.442126294 +0000 UTC m=+157.179876305" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.490445 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hbwft" podStartSLOduration=131.490419577 podStartE2EDuration="2m11.490419577s" 
podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.441762044 +0000 UTC m=+157.179512065" watchObservedRunningTime="2025-10-06 08:21:45.490419577 +0000 UTC m=+157.228169598" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.492105 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.492461 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:45.992447823 +0000 UTC m=+157.730197844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.539332 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6vv2" podStartSLOduration=132.539313477 podStartE2EDuration="2m12.539313477s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.537723583 +0000 UTC m=+157.275473604" watchObservedRunningTime="2025-10-06 08:21:45.539313477 +0000 UTC m=+157.277063488" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.539616 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tgxlv" podStartSLOduration=131.539611486 podStartE2EDuration="2m11.539611486s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.510960384 +0000 UTC m=+157.248710425" watchObservedRunningTime="2025-10-06 08:21:45.539611486 +0000 UTC m=+157.277361507" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.589262 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nxbxt" podStartSLOduration=131.589242325 podStartE2EDuration="2m11.589242325s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.566758115 +0000 UTC m=+157.304508136" watchObservedRunningTime="2025-10-06 08:21:45.589242325 +0000 UTC m=+157.326992336" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.593455 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.593882 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.093866893 +0000 UTC m=+157.831616914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.615177 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-l2z7h" podStartSLOduration=132.615157382 podStartE2EDuration="2m12.615157382s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.591825007 +0000 UTC m=+157.329575028" watchObservedRunningTime="2025-10-06 08:21:45.615157382 +0000 UTC m=+157.352907403" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.616578 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g55zd"] Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.618151 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.627379 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.638804 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g55zd"] Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.694685 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.694753 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-utilities\") pod \"certified-operators-g55zd\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.694790 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-catalog-content\") pod \"certified-operators-g55zd\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.694834 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srrtd\" (UniqueName: 
\"kubernetes.io/projected/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-kube-api-access-srrtd\") pod \"certified-operators-g55zd\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.695126 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.195113829 +0000 UTC m=+157.932863850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.710589 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9nxks" podStartSLOduration=7.710566516 podStartE2EDuration="7.710566516s" podCreationTimestamp="2025-10-06 08:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:45.667659031 +0000 UTC m=+157.405409052" watchObservedRunningTime="2025-10-06 08:21:45.710566516 +0000 UTC m=+157.448316537" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.768764 4991 patch_prober.go:28] interesting pod/router-default-5444994796-g2mjl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:21:45 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Oct 
06 08:21:45 crc kubenswrapper[4991]: [+]process-running ok Oct 06 08:21:45 crc kubenswrapper[4991]: healthz check failed Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.768844 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g2mjl" podUID="20560fe2-fd64-4aa1-9d9c-0f0046a10141" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.795364 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.795532 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srrtd\" (UniqueName: \"kubernetes.io/projected/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-kube-api-access-srrtd\") pod \"certified-operators-g55zd\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.795617 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-utilities\") pod \"certified-operators-g55zd\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.795660 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-catalog-content\") pod \"certified-operators-g55zd\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " 
pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.796077 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-catalog-content\") pod \"certified-operators-g55zd\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.796160 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.296143469 +0000 UTC m=+158.033893490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.796613 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-utilities\") pod \"certified-operators-g55zd\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.826259 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fqgbm"] Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.827288 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:21:45 crc kubenswrapper[4991]: W1006 08:21:45.834076 4991 reflector.go:561] object-"openshift-marketplace"/"community-operators-dockercfg-dmngl": failed to list *v1.Secret: secrets "community-operators-dockercfg-dmngl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.834123 4991 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"community-operators-dockercfg-dmngl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"community-operators-dockercfg-dmngl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.841362 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srrtd\" (UniqueName: \"kubernetes.io/projected/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-kube-api-access-srrtd\") pod \"certified-operators-g55zd\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.854470 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqgbm"] Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.910041 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9rcr\" (UniqueName: \"kubernetes.io/projected/632906da-50f0-468a-aac9-cb2aea39d813-kube-api-access-n9rcr\") pod \"community-operators-fqgbm\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " pod="openshift-marketplace/community-operators-fqgbm" Oct 
06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.910101 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-utilities\") pod \"community-operators-fqgbm\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.910149 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.910201 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-catalog-content\") pod \"community-operators-fqgbm\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:21:45 crc kubenswrapper[4991]: E1006 08:21:45.910634 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.41061472 +0000 UTC m=+158.148364741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:45 crc kubenswrapper[4991]: I1006 08:21:45.950737 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.012842 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.013580 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-utilities\") pod \"community-operators-fqgbm\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.013753 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-catalog-content\") pod \"community-operators-fqgbm\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.013802 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9rcr\" 
(UniqueName: \"kubernetes.io/projected/632906da-50f0-468a-aac9-cb2aea39d813-kube-api-access-n9rcr\") pod \"community-operators-fqgbm\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.014362 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.514342524 +0000 UTC m=+158.252092545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.014815 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-utilities\") pod \"community-operators-fqgbm\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.015036 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-catalog-content\") pod \"community-operators-fqgbm\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.032499 4991 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-bch4t"] Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.033702 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.048815 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bch4t"] Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.092848 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9rcr\" (UniqueName: \"kubernetes.io/projected/632906da-50f0-468a-aac9-cb2aea39d813-kube-api-access-n9rcr\") pod \"community-operators-fqgbm\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.115092 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.115170 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-catalog-content\") pod \"certified-operators-bch4t\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.115242 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-utilities\") pod \"certified-operators-bch4t\" (UID: 
\"3fd33015-5eee-4441-a373-4b062b28fefd\") " pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.115262 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg8fv\" (UniqueName: \"kubernetes.io/projected/3fd33015-5eee-4441-a373-4b062b28fefd-kube-api-access-tg8fv\") pod \"certified-operators-bch4t\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.115461 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.615442776 +0000 UTC m=+158.353192797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.218320 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.218614 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.718575934 +0000 UTC m=+158.456325955 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.219132 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.219162 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-catalog-content\") pod \"certified-operators-bch4t\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.219213 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-utilities\") pod \"certified-operators-bch4t\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.219231 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-tg8fv\" (UniqueName: \"kubernetes.io/projected/3fd33015-5eee-4441-a373-4b062b28fefd-kube-api-access-tg8fv\") pod \"certified-operators-bch4t\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.219974 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.719957461 +0000 UTC m=+158.457707482 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.220613 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-catalog-content\") pod \"certified-operators-bch4t\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.220860 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-utilities\") pod \"certified-operators-bch4t\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.233625 4991 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-wgkf7"] Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.247563 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.274817 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg8fv\" (UniqueName: \"kubernetes.io/projected/3fd33015-5eee-4441-a373-4b062b28fefd-kube-api-access-tg8fv\") pod \"certified-operators-bch4t\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.277277 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgkf7"] Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.332965 4991 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-vtcb6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.6:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.333118 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" podUID="ac30cd53-f61e-4f56-8110-4eacc0aade3f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.6:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.333346 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.333748 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j82bv\" (UniqueName: \"kubernetes.io/projected/2541e35c-acef-49c6-8117-1eaefe92a7b5-kube-api-access-j82bv\") pod \"community-operators-wgkf7\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.333826 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.833811546 +0000 UTC m=+158.571561567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.333991 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-catalog-content\") pod \"community-operators-wgkf7\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.334142 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.334407 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-utilities\") pod \"community-operators-wgkf7\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.334725 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.83471825 +0000 UTC m=+158.572468271 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.354150 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.380382 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" event={"ID":"e1bd509b-00aa-4e06-88fd-6849dbeab980","Type":"ContainerStarted","Data":"ba22f637859c22c56c6962a68162d9d350d9fdb5c5b9a3b7d4a485aa0318d8fa"} Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.382971 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8zvhl" event={"ID":"14e2aab8-50a5-4db6-9efa-579949a454bb","Type":"ContainerStarted","Data":"ecdee87d44e09db54f7a99f564d0ae3219ad9e31e569727c37048eb6b4cda51a"} Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.401070 4991 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-lq6ks container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" start-of-body= Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.401128 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" podUID="e1bd509b-00aa-4e06-88fd-6849dbeab980" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.402910 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" event={"ID":"adf23d88-9ae5-46c1-bb4b-fa513ab5beb6","Type":"ContainerStarted","Data":"14a32e5fbc0f7ea925d35d497412af4b94f8ee9e84b0f610f9f0b8f4fb8c6e4d"} Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.426520 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" event={"ID":"02f0b661-1f67-4d17-ae49-2c0c703a50ee","Type":"ContainerStarted","Data":"33c9d1b0a0323cba425549d6797a9be0ab9d184a57d2bce9177d5734294de54b"} Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.438190 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.438410 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-utilities\") pod \"community-operators-wgkf7\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.438464 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j82bv\" (UniqueName: \"kubernetes.io/projected/2541e35c-acef-49c6-8117-1eaefe92a7b5-kube-api-access-j82bv\") pod \"community-operators-wgkf7\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.438525 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-catalog-content\") pod \"community-operators-wgkf7\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.439524 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:46.939506335 +0000 UTC m=+158.677256346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.444820 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-catalog-content\") pod \"community-operators-wgkf7\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.451767 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-utilities\") pod \"community-operators-wgkf7\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.480623 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" event={"ID":"33d9fca8-faa3-43de-aee2-25ca05c03ab2","Type":"ContainerStarted","Data":"15f470c2ea7e77a8b309283b8dde30d9ba1c629029749acdd773d89146abd8ab"} Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.483396 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="1aab780d-af84-45fa-bc9c-b728d4e196d1" containerID="ec5697ed95fa543f6e1d7395b7ab21c28a99c64c2fee9d4f14b1e4db062368d8" exitCode=0 Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.483479 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" event={"ID":"1aab780d-af84-45fa-bc9c-b728d4e196d1","Type":"ContainerDied","Data":"ec5697ed95fa543f6e1d7395b7ab21c28a99c64c2fee9d4f14b1e4db062368d8"} Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.525814 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" event={"ID":"b0f81bad-4231-4e5b-bdb4-bd57fd0cddc5","Type":"ContainerStarted","Data":"e507732eb5023339e010db3ee6e8801eaf68c39c6e2719b4f05d9fa86aac9469"} Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.540429 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.542313 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:47.042281693 +0000 UTC m=+158.780031714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.558574 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lct7m" event={"ID":"4af307fe-9280-424c-8cba-6f63aead910b","Type":"ContainerStarted","Data":"68668d34d9ee1f7d84d5c304413aa32178b1045f2194adff95c09f7427f7a2c5"} Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.570899 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4pctg" podStartSLOduration=132.570883743 podStartE2EDuration="2m12.570883743s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:46.568783634 +0000 UTC m=+158.306533665" watchObservedRunningTime="2025-10-06 08:21:46.570883743 +0000 UTC m=+158.308633764" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.579349 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pgp24" event={"ID":"6603119d-4075-47ce-9cc8-4d030353dffa","Type":"ContainerStarted","Data":"fc1704beb79195bb16a5227628311b792cd649c88766f8514af449dda11de04e"} Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.599738 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9nxks" 
event={"ID":"24b7d367-f4b1-4ba9-b1f0-77e706c85ac4","Type":"ContainerStarted","Data":"8e8f0c21f419d4e90fa0d6faf8ff70d7dd7313f22d68d2395094308ba336b18d"} Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.601467 4991 patch_prober.go:28] interesting pod/console-operator-58897d9998-l2z7h container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/readyz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.601525 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-l2z7h" podUID="2eff5942-a2f5-4b2e-8a8a-d1555ffe952d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/readyz\": dial tcp 10.217.0.40:8443: connect: connection refused" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.603698 4991 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p5tk4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.603758 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" podUID="0a7333dc-b6d2-4513-8574-a95446be656b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.603860 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-nxbxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:21:46 crc 
kubenswrapper[4991]: I1006 08:21:46.603882 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nxbxt" podUID="bde59828-827b-4873-b51d-34038c9ab9ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.642058 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.643281 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j82bv\" (UniqueName: \"kubernetes.io/projected/2541e35c-acef-49c6-8117-1eaefe92a7b5-kube-api-access-j82bv\") pod \"community-operators-wgkf7\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.644137 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.644515 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:47.144503735 +0000 UTC m=+158.882253756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.647677 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g55zd"] Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.676370 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" podStartSLOduration=132.676348164 podStartE2EDuration="2m12.676348164s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:46.676010685 +0000 UTC m=+158.413760716" watchObservedRunningTime="2025-10-06 08:21:46.676348164 +0000 UTC m=+158.414098185" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.747068 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.796750 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-06 08:21:47.296725338 +0000 UTC m=+159.034475359 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.747071 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6zk95" podStartSLOduration=133.747051457 podStartE2EDuration="2m13.747051457s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:46.741889695 +0000 UTC m=+158.479639716" watchObservedRunningTime="2025-10-06 08:21:46.747051457 +0000 UTC m=+158.484801478" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.812637 4991 patch_prober.go:28] interesting pod/router-default-5444994796-g2mjl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:21:46 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Oct 06 08:21:46 crc kubenswrapper[4991]: [+]process-running ok Oct 06 08:21:46 crc kubenswrapper[4991]: healthz check failed Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.812717 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g2mjl" podUID="20560fe2-fd64-4aa1-9d9c-0f0046a10141" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.842190 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6g7qx" podStartSLOduration=132.842160044 podStartE2EDuration="2m12.842160044s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:46.825958586 +0000 UTC m=+158.563708607" watchObservedRunningTime="2025-10-06 08:21:46.842160044 +0000 UTC m=+158.579910065" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.850637 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.850982 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:47.350949716 +0000 UTC m=+159.088699727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.851563 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.852009 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:47.351993615 +0000 UTC m=+159.089743636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.944781 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gc8rv" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.952918 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:46 crc kubenswrapper[4991]: E1006 08:21:46.953452 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:47.453430435 +0000 UTC m=+159.191180456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.979763 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.981624 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:21:46 crc kubenswrapper[4991]: I1006 08:21:46.982505 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.058274 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.059068 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:47.559045092 +0000 UTC m=+159.296795113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.161963 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.162371 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:47.662350215 +0000 UTC m=+159.400100236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.263849 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.264203 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:47.764191207 +0000 UTC m=+159.501941228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.367886 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.368244 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:47.868226519 +0000 UTC m=+159.605976540 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.470137 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.470561 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:47.970539055 +0000 UTC m=+159.708289076 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.534363 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bch4t"] Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.571263 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.573515 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:48.073463797 +0000 UTC m=+159.811213818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.574818 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.575266 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:48.075250256 +0000 UTC m=+159.813000277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.675881 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.676154 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:48.176140632 +0000 UTC m=+159.913890653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.707783 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pgp24" event={"ID":"6603119d-4075-47ce-9cc8-4d030353dffa","Type":"ContainerStarted","Data":"f33a674ca481dd7812151aacca29c053800edafdb99e0025786feecf46803503"} Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.723846 4991 generic.go:334] "Generic (PLEG): container finished" podID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerID="32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4" exitCode=0 Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.724149 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g55zd" event={"ID":"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b","Type":"ContainerDied","Data":"32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4"} Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.724281 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g55zd" event={"ID":"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b","Type":"ContainerStarted","Data":"c5b7a66fe50c53bb868142fe7e6d5d230378501e61148b4ec803341259f3c792"} Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.725926 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.726951 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bch4t" event={"ID":"3fd33015-5eee-4441-a373-4b062b28fefd","Type":"ContainerStarted","Data":"7634aa47e8d6361de89e455b32844308475a697026b6e89f60ad4e52efb5ecce"} Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.728015 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-nxbxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.728057 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nxbxt" podUID="bde59828-827b-4873-b51d-34038c9ab9ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.767095 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-l2z7h" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.778762 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.780245 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:48.280225116 +0000 UTC m=+160.017975137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.782860 4991 patch_prober.go:28] interesting pod/router-default-5444994796-g2mjl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:21:47 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Oct 06 08:21:47 crc kubenswrapper[4991]: [+]process-running ok Oct 06 08:21:47 crc kubenswrapper[4991]: healthz check failed Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.782915 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g2mjl" podUID="20560fe2-fd64-4aa1-9d9c-0f0046a10141" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.805803 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgkf7"] Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.826095 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hl6f9"] Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.827050 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.843038 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl6f9"] Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.843034 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.879388 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.879780 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-catalog-content\") pod \"redhat-marketplace-hl6f9\" (UID: \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.879837 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-utilities\") pod \"redhat-marketplace-hl6f9\" (UID: \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.880074 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5tnz\" (UniqueName: \"kubernetes.io/projected/6f4da1f1-397f-4cb5-af9d-cb28306486a5-kube-api-access-x5tnz\") pod \"redhat-marketplace-hl6f9\" (UID: 
\"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.886870 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:48.38684964 +0000 UTC m=+160.124599661 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.923529 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lq6ks" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.989710 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-catalog-content\") pod \"redhat-marketplace-hl6f9\" (UID: \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.989774 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" 
Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.989798 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-utilities\") pod \"redhat-marketplace-hl6f9\" (UID: \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.989865 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5tnz\" (UniqueName: \"kubernetes.io/projected/6f4da1f1-397f-4cb5-af9d-cb28306486a5-kube-api-access-x5tnz\") pod \"redhat-marketplace-hl6f9\" (UID: \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.990859 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-catalog-content\") pod \"redhat-marketplace-hl6f9\" (UID: \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:47 crc kubenswrapper[4991]: E1006 08:21:47.991211 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:48.491198472 +0000 UTC m=+160.228948493 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:47 crc kubenswrapper[4991]: I1006 08:21:47.995929 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-utilities\") pod \"redhat-marketplace-hl6f9\" (UID: \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.063497 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5tnz\" (UniqueName: \"kubernetes.io/projected/6f4da1f1-397f-4cb5-af9d-cb28306486a5-kube-api-access-x5tnz\") pod \"redhat-marketplace-hl6f9\" (UID: \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.101786 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.101951 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 08:21:48.601920389 +0000 UTC m=+160.339670410 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.102129 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.102957 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:48.602940258 +0000 UTC m=+160.340690279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.138351 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqgbm"] Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.184016 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.202891 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.203240 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:48.703224536 +0000 UTC m=+160.440974547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.221952 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vzsxl"] Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.223063 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.256246 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzsxl"] Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.310404 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-utilities\") pod \"redhat-marketplace-vzsxl\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.310475 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.310504 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dxqrm\" (UniqueName: \"kubernetes.io/projected/741cebf1-af9b-4287-9804-47d3c702882d-kube-api-access-dxqrm\") pod \"redhat-marketplace-vzsxl\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.310592 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-catalog-content\") pod \"redhat-marketplace-vzsxl\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.310960 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:48.810947861 +0000 UTC m=+160.548697882 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.413163 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.413379 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-catalog-content\") pod \"redhat-marketplace-vzsxl\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.413412 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-utilities\") pod \"redhat-marketplace-vzsxl\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.413451 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxqrm\" (UniqueName: \"kubernetes.io/projected/741cebf1-af9b-4287-9804-47d3c702882d-kube-api-access-dxqrm\") pod \"redhat-marketplace-vzsxl\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " 
pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.413913 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-catalog-content\") pod \"redhat-marketplace-vzsxl\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.414032 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:48.914008817 +0000 UTC m=+160.651758838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.414241 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-utilities\") pod \"redhat-marketplace-vzsxl\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.441278 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxqrm\" (UniqueName: \"kubernetes.io/projected/741cebf1-af9b-4287-9804-47d3c702882d-kube-api-access-dxqrm\") pod \"redhat-marketplace-vzsxl\" (UID: 
\"741cebf1-af9b-4287-9804-47d3c702882d\") " pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.471612 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.514436 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.515180 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.015163831 +0000 UTC m=+160.752913852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.561949 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lcvbr"] Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.562153 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" podUID="f4ef6468-c4e0-4a26-820b-ddd444b50a07" containerName="controller-manager" containerID="cri-o://374db1bb08a8fdd28dbe241bb6c2e81d39360e0b1bb0ba55f97d680a4cba2beb" gracePeriod=30 Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.615817 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-png2l\" (UniqueName: \"kubernetes.io/projected/1aab780d-af84-45fa-bc9c-b728d4e196d1-kube-api-access-png2l\") pod \"1aab780d-af84-45fa-bc9c-b728d4e196d1\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.615957 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aab780d-af84-45fa-bc9c-b728d4e196d1-secret-volume\") pod \"1aab780d-af84-45fa-bc9c-b728d4e196d1\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.616088 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.616127 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aab780d-af84-45fa-bc9c-b728d4e196d1-config-volume\") pod \"1aab780d-af84-45fa-bc9c-b728d4e196d1\" (UID: \"1aab780d-af84-45fa-bc9c-b728d4e196d1\") " Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.617393 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aab780d-af84-45fa-bc9c-b728d4e196d1-config-volume" (OuterVolumeSpecName: "config-volume") pod "1aab780d-af84-45fa-bc9c-b728d4e196d1" (UID: "1aab780d-af84-45fa-bc9c-b728d4e196d1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.618629 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.118598607 +0000 UTC m=+160.856348628 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.623843 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aab780d-af84-45fa-bc9c-b728d4e196d1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1aab780d-af84-45fa-bc9c-b728d4e196d1" (UID: "1aab780d-af84-45fa-bc9c-b728d4e196d1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.624118 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aab780d-af84-45fa-bc9c-b728d4e196d1-kube-api-access-png2l" (OuterVolumeSpecName: "kube-api-access-png2l") pod "1aab780d-af84-45fa-bc9c-b728d4e196d1" (UID: "1aab780d-af84-45fa-bc9c-b728d4e196d1"). InnerVolumeSpecName "kube-api-access-png2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.674063 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.686320 4991 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.719328 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.719444 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aab780d-af84-45fa-bc9c-b728d4e196d1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.719457 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-png2l\" (UniqueName: \"kubernetes.io/projected/1aab780d-af84-45fa-bc9c-b728d4e196d1-kube-api-access-png2l\") on node \"crc\" DevicePath \"\"" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.719479 4991 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aab780d-af84-45fa-bc9c-b728d4e196d1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.719745 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.219733249 +0000 UTC m=+160.957483270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.745806 4991 generic.go:334] "Generic (PLEG): container finished" podID="3fd33015-5eee-4441-a373-4b062b28fefd" containerID="9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f" exitCode=0 Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.745896 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bch4t" event={"ID":"3fd33015-5eee-4441-a373-4b062b28fefd","Type":"ContainerDied","Data":"9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f"} Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.750200 4991 generic.go:334] "Generic (PLEG): container finished" podID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerID="e0560346920730128655c04711bcac32c0e20551c236c2da15820d2222682c38" exitCode=0 Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.750265 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkf7" event={"ID":"2541e35c-acef-49c6-8117-1eaefe92a7b5","Type":"ContainerDied","Data":"e0560346920730128655c04711bcac32c0e20551c236c2da15820d2222682c38"} Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.750295 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkf7" event={"ID":"2541e35c-acef-49c6-8117-1eaefe92a7b5","Type":"ContainerStarted","Data":"04260e5b89a2b5f0af1e1f8b156b5ad99a054a2a7d7339335b6519c283b311ee"} Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.754718 
4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pgp24" event={"ID":"6603119d-4075-47ce-9cc8-4d030353dffa","Type":"ContainerStarted","Data":"51c90cc61a8e4ffcccf2049ce0668c69ae0c738c090a0334b8f06f2866938340"} Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.759854 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" event={"ID":"f4ef6468-c4e0-4a26-820b-ddd444b50a07","Type":"ContainerDied","Data":"374db1bb08a8fdd28dbe241bb6c2e81d39360e0b1bb0ba55f97d680a4cba2beb"} Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.759815 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4ef6468-c4e0-4a26-820b-ddd444b50a07" containerID="374db1bb08a8fdd28dbe241bb6c2e81d39360e0b1bb0ba55f97d680a4cba2beb" exitCode=0 Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.766362 4991 patch_prober.go:28] interesting pod/router-default-5444994796-g2mjl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:21:48 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Oct 06 08:21:48 crc kubenswrapper[4991]: [+]process-running ok Oct 06 08:21:48 crc kubenswrapper[4991]: healthz check failed Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.766406 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g2mjl" podUID="20560fe2-fd64-4aa1-9d9c-0f0046a10141" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.767694 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" 
event={"ID":"1aab780d-af84-45fa-bc9c-b728d4e196d1","Type":"ContainerDied","Data":"ba705eef35c210d8cb9a999a89cdd488581be5e59d74f52c114624d017daae95"} Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.767718 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba705eef35c210d8cb9a999a89cdd488581be5e59d74f52c114624d017daae95" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.767779 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.786190 4991 generic.go:334] "Generic (PLEG): container finished" podID="632906da-50f0-468a-aac9-cb2aea39d813" containerID="f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c" exitCode=0 Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.786311 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqgbm" event={"ID":"632906da-50f0-468a-aac9-cb2aea39d813","Type":"ContainerDied","Data":"f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c"} Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.786333 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqgbm" event={"ID":"632906da-50f0-468a-aac9-cb2aea39d813","Type":"ContainerStarted","Data":"71beaaa2b540b438c48c7dcd6d76b4b634ae35cc2abb8e1f25eccae335d0d8ad"} Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.804121 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r8pwh"] Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.804427 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aab780d-af84-45fa-bc9c-b728d4e196d1" containerName="collect-profiles" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.804441 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1aab780d-af84-45fa-bc9c-b728d4e196d1" containerName="collect-profiles" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.804573 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aab780d-af84-45fa-bc9c-b728d4e196d1" containerName="collect-profiles" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.805543 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.808353 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.820186 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.822764 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.321443998 +0000 UTC m=+161.059194019 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.823416 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8pwh"] Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.922924 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-catalog-content\") pod \"redhat-operators-r8pwh\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.923140 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-utilities\") pod \"redhat-operators-r8pwh\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.923186 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt2nr\" (UniqueName: \"kubernetes.io/projected/ee1292d9-c828-4aa7-819b-015bcc128d0b-kube-api-access-wt2nr\") pod \"redhat-operators-r8pwh\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.923236 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:48 crc kubenswrapper[4991]: E1006 08:21:48.923721 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.423703972 +0000 UTC m=+161.161453993 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:48 crc kubenswrapper[4991]: I1006 08:21:48.978060 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl6f9"] Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.024986 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:49 crc kubenswrapper[4991]: E1006 08:21:49.025199 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 08:21:49.525167223 +0000 UTC m=+161.262917254 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.025281 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-utilities\") pod \"redhat-operators-r8pwh\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.025325 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt2nr\" (UniqueName: \"kubernetes.io/projected/ee1292d9-c828-4aa7-819b-015bcc128d0b-kube-api-access-wt2nr\") pod \"redhat-operators-r8pwh\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.025356 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.025404 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-catalog-content\") pod \"redhat-operators-r8pwh\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.025805 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-utilities\") pod \"redhat-operators-r8pwh\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.025831 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-catalog-content\") pod \"redhat-operators-r8pwh\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:49 crc kubenswrapper[4991]: E1006 08:21:49.025889 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.525871913 +0000 UTC m=+161.263621934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.046275 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt2nr\" (UniqueName: \"kubernetes.io/projected/ee1292d9-c828-4aa7-819b-015bcc128d0b-kube-api-access-wt2nr\") pod \"redhat-operators-r8pwh\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.074101 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.080228 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzsxl"] Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.127688 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.133396 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:49 crc kubenswrapper[4991]: E1006 08:21:49.133504 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.633478844 +0000 UTC m=+161.371228865 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.133629 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:49 crc kubenswrapper[4991]: E1006 08:21:49.133939 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.633932057 +0000 UTC m=+161.371682078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.209017 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zs6jw"] Oct 06 08:21:49 crc kubenswrapper[4991]: E1006 08:21:49.209240 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ef6468-c4e0-4a26-820b-ddd444b50a07" containerName="controller-manager" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.209253 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ef6468-c4e0-4a26-820b-ddd444b50a07" containerName="controller-manager" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.209374 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ef6468-c4e0-4a26-820b-ddd444b50a07" containerName="controller-manager" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.210037 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.230180 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zs6jw"] Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.235482 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-289sj\" (UniqueName: \"kubernetes.io/projected/f4ef6468-c4e0-4a26-820b-ddd444b50a07-kube-api-access-289sj\") pod \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.235608 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.235653 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ef6468-c4e0-4a26-820b-ddd444b50a07-serving-cert\") pod \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.235689 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-proxy-ca-bundles\") pod \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " Oct 06 08:21:49 crc kubenswrapper[4991]: E1006 08:21:49.235734 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 08:21:49.735706097 +0000 UTC m=+161.473456118 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.235771 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-config\") pod \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.235827 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-client-ca\") pod \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\" (UID: \"f4ef6468-c4e0-4a26-820b-ddd444b50a07\") " Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.236249 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.236408 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod 
"f4ef6468-c4e0-4a26-820b-ddd444b50a07" (UID: "f4ef6468-c4e0-4a26-820b-ddd444b50a07"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:21:49 crc kubenswrapper[4991]: E1006 08:21:49.236582 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.736575261 +0000 UTC m=+161.474325282 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.236759 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-client-ca" (OuterVolumeSpecName: "client-ca") pod "f4ef6468-c4e0-4a26-820b-ddd444b50a07" (UID: "f4ef6468-c4e0-4a26-820b-ddd444b50a07"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.237005 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-config" (OuterVolumeSpecName: "config") pod "f4ef6468-c4e0-4a26-820b-ddd444b50a07" (UID: "f4ef6468-c4e0-4a26-820b-ddd444b50a07"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.241075 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4ef6468-c4e0-4a26-820b-ddd444b50a07-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f4ef6468-c4e0-4a26-820b-ddd444b50a07" (UID: "f4ef6468-c4e0-4a26-820b-ddd444b50a07"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.242402 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ef6468-c4e0-4a26-820b-ddd444b50a07-kube-api-access-289sj" (OuterVolumeSpecName: "kube-api-access-289sj") pod "f4ef6468-c4e0-4a26-820b-ddd444b50a07" (UID: "f4ef6468-c4e0-4a26-820b-ddd444b50a07"). InnerVolumeSpecName "kube-api-access-289sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.337984 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.338233 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbwp\" (UniqueName: \"kubernetes.io/projected/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-kube-api-access-5qbwp\") pod \"redhat-operators-zs6jw\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: E1006 08:21:49.339184 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.839120903 +0000 UTC m=+161.576870934 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.339262 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-catalog-content\") pod \"redhat-operators-zs6jw\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.339316 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.339422 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-utilities\") pod \"redhat-operators-zs6jw\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.339486 4991 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-289sj\" (UniqueName: \"kubernetes.io/projected/f4ef6468-c4e0-4a26-820b-ddd444b50a07-kube-api-access-289sj\") on node \"crc\" DevicePath \"\"" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.339593 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ef6468-c4e0-4a26-820b-ddd444b50a07-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.339643 4991 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.339661 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.339733 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4ef6468-c4e0-4a26-820b-ddd444b50a07-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:21:49 crc kubenswrapper[4991]: E1006 08:21:49.339944 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 08:21:49.839920774 +0000 UTC m=+161.577670795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zl4k8" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.388867 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8pwh"] Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.400435 4991 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-06T08:21:48.686354458Z","Handler":null,"Name":""} Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.404704 4991 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.404732 4991 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 06 08:21:49 crc kubenswrapper[4991]: W1006 08:21:49.409530 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee1292d9_c828_4aa7_819b_015bcc128d0b.slice/crio-a97c699a5419a3e3c5996827c633045520ebe75f3e11a30ade9425d654a5c5d4 WatchSource:0}: Error finding container a97c699a5419a3e3c5996827c633045520ebe75f3e11a30ade9425d654a5c5d4: Status 404 returned error can't find the container with id a97c699a5419a3e3c5996827c633045520ebe75f3e11a30ade9425d654a5c5d4 Oct 06 08:21:49 crc kubenswrapper[4991]: 
I1006 08:21:49.440766 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.441057 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-utilities\") pod \"redhat-operators-zs6jw\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.441120 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbwp\" (UniqueName: \"kubernetes.io/projected/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-kube-api-access-5qbwp\") pod \"redhat-operators-zs6jw\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.441182 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-catalog-content\") pod \"redhat-operators-zs6jw\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.441587 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-utilities\") pod \"redhat-operators-zs6jw\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.441652 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-catalog-content\") pod \"redhat-operators-zs6jw\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.445347 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.458248 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbwp\" (UniqueName: \"kubernetes.io/projected/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-kube-api-access-5qbwp\") pod \"redhat-operators-zs6jw\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.542521 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.545639 4991 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.545684 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.562321 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.576932 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zl4k8\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.665692 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.676076 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-d2pr9" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.765748 4991 patch_prober.go:28] interesting pod/router-default-5444994796-g2mjl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:21:49 crc kubenswrapper[4991]: 
[-]has-synced failed: reason withheld Oct 06 08:21:49 crc kubenswrapper[4991]: [+]process-running ok Oct 06 08:21:49 crc kubenswrapper[4991]: healthz check failed Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.766061 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g2mjl" podUID="20560fe2-fd64-4aa1-9d9c-0f0046a10141" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.826465 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz6jp"] Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.828573 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.829720 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pgp24" event={"ID":"6603119d-4075-47ce-9cc8-4d030353dffa","Type":"ContainerStarted","Data":"0e0e7a578fb3d69c24f44cc79b46886f5e841cfd6fab0585b816fb0e0a2d9ef4"} Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.836984 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz6jp"] Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.840656 4991 generic.go:334] "Generic (PLEG): container finished" podID="741cebf1-af9b-4287-9804-47d3c702882d" containerID="601c6534660abfb8f033417713520e07372b18a95ca983d897d3ccfbb1d01fbd" exitCode=0 Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.840737 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzsxl" event={"ID":"741cebf1-af9b-4287-9804-47d3c702882d","Type":"ContainerDied","Data":"601c6534660abfb8f033417713520e07372b18a95ca983d897d3ccfbb1d01fbd"} Oct 06 08:21:49 crc kubenswrapper[4991]: 
I1006 08:21:49.840768 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzsxl" event={"ID":"741cebf1-af9b-4287-9804-47d3c702882d","Type":"ContainerStarted","Data":"6746254f7a38d4766db244216060b8cbec2d169bfca0a6d154f924cb2728e714"} Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.850706 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.850946 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lcvbr" event={"ID":"f4ef6468-c4e0-4a26-820b-ddd444b50a07","Type":"ContainerDied","Data":"39b9f7887a1e5bc1b35e009576c3819df509ca66efa1ddd790caff9103c6f59c"} Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.851037 4991 scope.go:117] "RemoveContainer" containerID="374db1bb08a8fdd28dbe241bb6c2e81d39360e0b1bb0ba55f97d680a4cba2beb" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.877567 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.882627 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-pgp24" podStartSLOduration=11.88260574 podStartE2EDuration="11.88260574s" podCreationTimestamp="2025-10-06 08:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:49.879731531 +0000 UTC m=+161.617481552" watchObservedRunningTime="2025-10-06 08:21:49.88260574 +0000 UTC m=+161.620355771" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.889962 4991 generic.go:334] "Generic (PLEG): container finished" podID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerID="ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085" exitCode=0 Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.890078 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8pwh" event={"ID":"ee1292d9-c828-4aa7-819b-015bcc128d0b","Type":"ContainerDied","Data":"ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085"} Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.890131 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8pwh" event={"ID":"ee1292d9-c828-4aa7-819b-015bcc128d0b","Type":"ContainerStarted","Data":"a97c699a5419a3e3c5996827c633045520ebe75f3e11a30ade9425d654a5c5d4"} Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.905238 4991 generic.go:334] "Generic (PLEG): container finished" podID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerID="ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80" exitCode=0 Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.905322 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl6f9" 
event={"ID":"6f4da1f1-397f-4cb5-af9d-cb28306486a5","Type":"ContainerDied","Data":"ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80"} Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.905376 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl6f9" event={"ID":"6f4da1f1-397f-4cb5-af9d-cb28306486a5","Type":"ContainerStarted","Data":"9f68343f6c4da1c4eeb427d2a26b3c54c6c6a8d75ba00e3dbb9ce6235bf91efc"} Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.906831 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lcvbr"] Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.914506 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lcvbr"] Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.970007 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-client-ca\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.970055 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-config\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.970087 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76drl\" (UniqueName: \"kubernetes.io/projected/4a605716-cfa0-49ed-826e-bb9b2cd4d834-kube-api-access-76drl\") pod 
\"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.970129 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:49 crc kubenswrapper[4991]: I1006 08:21:49.970156 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a605716-cfa0-49ed-826e-bb9b2cd4d834-serving-cert\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.029545 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.030211 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.037768 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.038002 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.048886 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.071379 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-client-ca\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.071416 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-config\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.071453 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76drl\" (UniqueName: \"kubernetes.io/projected/4a605716-cfa0-49ed-826e-bb9b2cd4d834-kube-api-access-76drl\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.071525 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.071551 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a605716-cfa0-49ed-826e-bb9b2cd4d834-serving-cert\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.073108 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-config\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.073124 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.073478 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-client-ca\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: 
I1006 08:21:50.078377 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a605716-cfa0-49ed-826e-bb9b2cd4d834-serving-cert\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.094967 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76drl\" (UniqueName: \"kubernetes.io/projected/4a605716-cfa0-49ed-826e-bb9b2cd4d834-kube-api-access-76drl\") pod \"controller-manager-879f6c89f-pz6jp\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.148499 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zs6jw"] Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.156197 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.187059 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e532e0b-f8e0-4f4e-a42f-22f944b9814f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.187160 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e532e0b-f8e0-4f4e-a42f-22f944b9814f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.287956 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e532e0b-f8e0-4f4e-a42f-22f944b9814f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.288158 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e532e0b-f8e0-4f4e-a42f-22f944b9814f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.288592 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"0e532e0b-f8e0-4f4e-a42f-22f944b9814f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.308574 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e532e0b-f8e0-4f4e-a42f-22f944b9814f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.332288 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zl4k8"] Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.358869 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.524832 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz6jp"] Oct 06 08:21:50 crc kubenswrapper[4991]: W1006 08:21:50.596439 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a605716_cfa0_49ed_826e_bb9b2cd4d834.slice/crio-5798e322bd783c09421f45c2f0867315f2abf1ce387a8b2466521d160ffcfd58 WatchSource:0}: Error finding container 5798e322bd783c09421f45c2f0867315f2abf1ce387a8b2466521d160ffcfd58: Status 404 returned error can't find the container with id 5798e322bd783c09421f45c2f0867315f2abf1ce387a8b2466521d160ffcfd58 Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.763714 4991 patch_prober.go:28] interesting pod/router-default-5444994796-g2mjl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:21:50 crc kubenswrapper[4991]: [-]has-synced failed: reason 
withheld Oct 06 08:21:50 crc kubenswrapper[4991]: [+]process-running ok Oct 06 08:21:50 crc kubenswrapper[4991]: healthz check failed Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.764321 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g2mjl" podUID="20560fe2-fd64-4aa1-9d9c-0f0046a10141" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.926700 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.927421 4991 generic.go:334] "Generic (PLEG): container finished" podID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerID="e83054da1dbe0adcf4f07f78fc73e9dec7df87b0f36ca38235b8b77713fd45ef" exitCode=0 Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.927575 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs6jw" event={"ID":"94d0cbbb-3329-45c8-8c99-f49fc4068d6d","Type":"ContainerDied","Data":"e83054da1dbe0adcf4f07f78fc73e9dec7df87b0f36ca38235b8b77713fd45ef"} Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.927618 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.927636 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs6jw" event={"ID":"94d0cbbb-3329-45c8-8c99-f49fc4068d6d","Type":"ContainerStarted","Data":"1266b3136ec594b8d4cba72ec3ef30fd9ea9f07c94e40e068f37690c5a41ba81"} Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.940395 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.962442 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" event={"ID":"fb83cb02-67d8-4f38-aad6-001ea28de60a","Type":"ContainerStarted","Data":"b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245"} Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.962483 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" event={"ID":"fb83cb02-67d8-4f38-aad6-001ea28de60a","Type":"ContainerStarted","Data":"6496040adc61a7bae86d50e9d9ead6a70f461991343f51b2c4bbbf2814ec0e0b"} Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.962596 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.970751 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" event={"ID":"4a605716-cfa0-49ed-826e-bb9b2cd4d834","Type":"ContainerStarted","Data":"c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d"} Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.970793 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" event={"ID":"4a605716-cfa0-49ed-826e-bb9b2cd4d834","Type":"ContainerStarted","Data":"5798e322bd783c09421f45c2f0867315f2abf1ce387a8b2466521d160ffcfd58"} Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.972158 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:50 crc kubenswrapper[4991]: I1006 08:21:50.991793 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:50.998512 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.015251 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" podStartSLOduration=3.015231916 podStartE2EDuration="3.015231916s" podCreationTimestamp="2025-10-06 08:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:51.007925024 +0000 UTC m=+162.745675045" watchObservedRunningTime="2025-10-06 08:21:51.015231916 +0000 UTC m=+162.752981937" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.058347 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" podStartSLOduration=137.058325146 podStartE2EDuration="2m17.058325146s" podCreationTimestamp="2025-10-06 08:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:51.032635996 +0000 UTC m=+162.770386027" watchObservedRunningTime="2025-10-06 08:21:51.058325146 +0000 UTC m=+162.796075467" Oct 06 08:21:51 crc kubenswrapper[4991]: W1006 08:21:51.068908 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e532e0b_f8e0_4f4e_a42f_22f944b9814f.slice/crio-20bdcc4428223b1a54b976fcb6899ae868618c406ee0efd7af2f1a6c694eb3c3 WatchSource:0}: Error finding container 20bdcc4428223b1a54b976fcb6899ae868618c406ee0efd7af2f1a6c694eb3c3: Status 404 returned error can't find the container with id 20bdcc4428223b1a54b976fcb6899ae868618c406ee0efd7af2f1a6c694eb3c3 Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.149950 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.150871 
4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.160719 4991 patch_prober.go:28] interesting pod/console-f9d7485db-kp5gc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.160805 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kp5gc" podUID="c941e944-a837-41ff-90b0-29464fc3f02d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.270173 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.270739 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4ef6468-c4e0-4a26-820b-ddd444b50a07" path="/var/lib/kubelet/pods/f4ef6468-c4e0-4a26-820b-ddd444b50a07/volumes" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.553509 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-nxbxt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.553830 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nxbxt" podUID="bde59828-827b-4873-b51d-34038c9ab9ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection 
refused" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.554033 4991 patch_prober.go:28] interesting pod/downloads-7954f5f757-nxbxt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.554083 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nxbxt" podUID="bde59828-827b-4873-b51d-34038c9ab9ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.563850 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.760488 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.762322 4991 patch_prober.go:28] interesting pod/router-default-5444994796-g2mjl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:21:51 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Oct 06 08:21:51 crc kubenswrapper[4991]: [+]process-running ok Oct 06 08:21:51 crc kubenswrapper[4991]: healthz check failed Oct 06 08:21:51 crc kubenswrapper[4991]: I1006 08:21:51.762382 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g2mjl" podUID="20560fe2-fd64-4aa1-9d9c-0f0046a10141" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:21:52 crc kubenswrapper[4991]: I1006 08:21:52.033423 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0e532e0b-f8e0-4f4e-a42f-22f944b9814f","Type":"ContainerStarted","Data":"20bdcc4428223b1a54b976fcb6899ae868618c406ee0efd7af2f1a6c694eb3c3"} Oct 06 08:21:52 crc kubenswrapper[4991]: I1006 08:21:52.050099 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zlfb2" Oct 06 08:21:52 crc kubenswrapper[4991]: I1006 08:21:52.058456 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.058428682 podStartE2EDuration="2.058428682s" podCreationTimestamp="2025-10-06 08:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:52.056889249 +0000 UTC m=+163.794639270" watchObservedRunningTime="2025-10-06 08:21:52.058428682 +0000 UTC m=+163.796178703" Oct 06 08:21:52 crc kubenswrapper[4991]: I1006 08:21:52.760842 4991 patch_prober.go:28] interesting pod/router-default-5444994796-g2mjl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 08:21:52 crc kubenswrapper[4991]: [-]has-synced failed: reason withheld Oct 06 08:21:52 crc kubenswrapper[4991]: [+]process-running ok Oct 06 08:21:52 crc kubenswrapper[4991]: healthz check failed Oct 06 08:21:52 crc kubenswrapper[4991]: I1006 08:21:52.761166 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-g2mjl" podUID="20560fe2-fd64-4aa1-9d9c-0f0046a10141" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.078988 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="0e532e0b-f8e0-4f4e-a42f-22f944b9814f" containerID="b710077d6cb15139b6a7cf67ee8a380003cb7270b354bc43cfd6afc863934ea3" exitCode=0 Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.079056 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0e532e0b-f8e0-4f4e-a42f-22f944b9814f","Type":"ContainerDied","Data":"b710077d6cb15139b6a7cf67ee8a380003cb7270b354bc43cfd6afc863934ea3"} Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.497835 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9nxks" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.552684 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.553673 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.556311 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.556529 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.579139 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.690671 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/317a739a-78c6-4089-8bc2-a8e3a0388522-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"317a739a-78c6-4089-8bc2-a8e3a0388522\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 
08:21:53.690765 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/317a739a-78c6-4089-8bc2-a8e3a0388522-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"317a739a-78c6-4089-8bc2-a8e3a0388522\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.762869 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.771086 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-g2mjl" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.792452 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/317a739a-78c6-4089-8bc2-a8e3a0388522-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"317a739a-78c6-4089-8bc2-a8e3a0388522\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.792612 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/317a739a-78c6-4089-8bc2-a8e3a0388522-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"317a739a-78c6-4089-8bc2-a8e3a0388522\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.792667 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/317a739a-78c6-4089-8bc2-a8e3a0388522-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"317a739a-78c6-4089-8bc2-a8e3a0388522\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.824275 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/317a739a-78c6-4089-8bc2-a8e3a0388522-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"317a739a-78c6-4089-8bc2-a8e3a0388522\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:21:53 crc kubenswrapper[4991]: I1006 08:21:53.906349 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:21:54 crc kubenswrapper[4991]: I1006 08:21:54.255698 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 08:21:54 crc kubenswrapper[4991]: I1006 08:21:54.450547 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:21:54 crc kubenswrapper[4991]: I1006 08:21:54.618715 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kube-api-access\") pod \"0e532e0b-f8e0-4f4e-a42f-22f944b9814f\" (UID: \"0e532e0b-f8e0-4f4e-a42f-22f944b9814f\") " Oct 06 08:21:54 crc kubenswrapper[4991]: I1006 08:21:54.619150 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kubelet-dir\") pod \"0e532e0b-f8e0-4f4e-a42f-22f944b9814f\" (UID: \"0e532e0b-f8e0-4f4e-a42f-22f944b9814f\") " Oct 06 08:21:54 crc kubenswrapper[4991]: I1006 08:21:54.619246 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0e532e0b-f8e0-4f4e-a42f-22f944b9814f" (UID: "0e532e0b-f8e0-4f4e-a42f-22f944b9814f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:21:54 crc kubenswrapper[4991]: I1006 08:21:54.619488 4991 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 08:21:54 crc kubenswrapper[4991]: I1006 08:21:54.623240 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0e532e0b-f8e0-4f4e-a42f-22f944b9814f" (UID: "0e532e0b-f8e0-4f4e-a42f-22f944b9814f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:21:54 crc kubenswrapper[4991]: I1006 08:21:54.721133 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e532e0b-f8e0-4f4e-a42f-22f944b9814f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:21:55 crc kubenswrapper[4991]: I1006 08:21:55.097671 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"317a739a-78c6-4089-8bc2-a8e3a0388522","Type":"ContainerStarted","Data":"1e7a3f1dacbd9046b1389300afce30b8d7e4895a18ad4e8a8ced5a43f13869d2"} Oct 06 08:21:55 crc kubenswrapper[4991]: I1006 08:21:55.105948 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0e532e0b-f8e0-4f4e-a42f-22f944b9814f","Type":"ContainerDied","Data":"20bdcc4428223b1a54b976fcb6899ae868618c406ee0efd7af2f1a6c694eb3c3"} Oct 06 08:21:55 crc kubenswrapper[4991]: I1006 08:21:55.105994 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20bdcc4428223b1a54b976fcb6899ae868618c406ee0efd7af2f1a6c694eb3c3" Oct 06 08:21:55 crc kubenswrapper[4991]: I1006 08:21:55.106049 4991 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 08:21:56 crc kubenswrapper[4991]: I1006 08:21:56.115526 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"317a739a-78c6-4089-8bc2-a8e3a0388522","Type":"ContainerStarted","Data":"6ec58b42487ff88b5b73531deadfde217b2af377662d37f34f8ce0812b22ad81"} Oct 06 08:21:56 crc kubenswrapper[4991]: I1006 08:21:56.136233 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.136216274 podStartE2EDuration="3.136216274s" podCreationTimestamp="2025-10-06 08:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:21:56.132492581 +0000 UTC m=+167.870242602" watchObservedRunningTime="2025-10-06 08:21:56.136216274 +0000 UTC m=+167.873966295" Oct 06 08:21:56 crc kubenswrapper[4991]: I1006 08:21:56.556673 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:56 crc kubenswrapper[4991]: I1006 08:21:56.562968 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e38e446-d0d7-463a-987a-110a8e95fe84-metrics-certs\") pod \"network-metrics-daemon-787zw\" (UID: \"3e38e446-d0d7-463a-987a-110a8e95fe84\") " pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:56 crc kubenswrapper[4991]: I1006 08:21:56.775940 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-787zw" Oct 06 08:21:57 crc kubenswrapper[4991]: I1006 08:21:57.130552 4991 generic.go:334] "Generic (PLEG): container finished" podID="317a739a-78c6-4089-8bc2-a8e3a0388522" containerID="6ec58b42487ff88b5b73531deadfde217b2af377662d37f34f8ce0812b22ad81" exitCode=0 Oct 06 08:21:57 crc kubenswrapper[4991]: I1006 08:21:57.130609 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"317a739a-78c6-4089-8bc2-a8e3a0388522","Type":"ContainerDied","Data":"6ec58b42487ff88b5b73531deadfde217b2af377662d37f34f8ce0812b22ad81"} Oct 06 08:21:57 crc kubenswrapper[4991]: I1006 08:21:57.530043 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:21:57 crc kubenswrapper[4991]: I1006 08:21:57.530120 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:22:01 crc kubenswrapper[4991]: I1006 08:22:01.153245 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:22:01 crc kubenswrapper[4991]: I1006 08:22:01.157311 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:22:01 crc kubenswrapper[4991]: I1006 08:22:01.559509 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nxbxt" Oct 06 08:22:02 crc kubenswrapper[4991]: 
I1006 08:22:02.427710 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:22:02 crc kubenswrapper[4991]: I1006 08:22:02.461069 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/317a739a-78c6-4089-8bc2-a8e3a0388522-kube-api-access\") pod \"317a739a-78c6-4089-8bc2-a8e3a0388522\" (UID: \"317a739a-78c6-4089-8bc2-a8e3a0388522\") " Oct 06 08:22:02 crc kubenswrapper[4991]: I1006 08:22:02.461153 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/317a739a-78c6-4089-8bc2-a8e3a0388522-kubelet-dir\") pod \"317a739a-78c6-4089-8bc2-a8e3a0388522\" (UID: \"317a739a-78c6-4089-8bc2-a8e3a0388522\") " Oct 06 08:22:02 crc kubenswrapper[4991]: I1006 08:22:02.461505 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/317a739a-78c6-4089-8bc2-a8e3a0388522-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "317a739a-78c6-4089-8bc2-a8e3a0388522" (UID: "317a739a-78c6-4089-8bc2-a8e3a0388522"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:22:02 crc kubenswrapper[4991]: I1006 08:22:02.474634 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317a739a-78c6-4089-8bc2-a8e3a0388522-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "317a739a-78c6-4089-8bc2-a8e3a0388522" (UID: "317a739a-78c6-4089-8bc2-a8e3a0388522"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:02 crc kubenswrapper[4991]: I1006 08:22:02.565482 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/317a739a-78c6-4089-8bc2-a8e3a0388522-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:02 crc kubenswrapper[4991]: I1006 08:22:02.565513 4991 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/317a739a-78c6-4089-8bc2-a8e3a0388522-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:03 crc kubenswrapper[4991]: I1006 08:22:03.191025 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"317a739a-78c6-4089-8bc2-a8e3a0388522","Type":"ContainerDied","Data":"1e7a3f1dacbd9046b1389300afce30b8d7e4895a18ad4e8a8ced5a43f13869d2"} Oct 06 08:22:03 crc kubenswrapper[4991]: I1006 08:22:03.191078 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e7a3f1dacbd9046b1389300afce30b8d7e4895a18ad4e8a8ced5a43f13869d2" Oct 06 08:22:03 crc kubenswrapper[4991]: I1006 08:22:03.191190 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 08:22:07 crc kubenswrapper[4991]: I1006 08:22:07.388875 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 08:22:09 crc kubenswrapper[4991]: I1006 08:22:09.891801 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:22:14 crc kubenswrapper[4991]: E1006 08:22:14.835500 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 06 08:22:14 crc kubenswrapper[4991]: E1006 08:22:14.836709 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5qbwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zs6jw_openshift-marketplace(94d0cbbb-3329-45c8-8c99-f49fc4068d6d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:22:14 crc kubenswrapper[4991]: E1006 08:22:14.838018 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zs6jw" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" Oct 06 08:22:15 crc 
kubenswrapper[4991]: E1006 08:22:15.892639 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zs6jw" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" Oct 06 08:22:15 crc kubenswrapper[4991]: E1006 08:22:15.958331 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 06 08:22:15 crc kubenswrapper[4991]: E1006 08:22:15.958534 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tg8fv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bch4t_openshift-marketplace(3fd33015-5eee-4441-a373-4b062b28fefd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:22:15 crc kubenswrapper[4991]: E1006 08:22:15.960929 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bch4t" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" Oct 06 08:22:18 crc 
kubenswrapper[4991]: E1006 08:22:18.261082 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bch4t" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" Oct 06 08:22:18 crc kubenswrapper[4991]: I1006 08:22:18.431905 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-787zw"] Oct 06 08:22:18 crc kubenswrapper[4991]: E1006 08:22:18.477405 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 06 08:22:18 crc kubenswrapper[4991]: E1006 08:22:18.477556 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srrtd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-g55zd_openshift-marketplace(7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:22:18 crc kubenswrapper[4991]: E1006 08:22:18.478726 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-g55zd" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" Oct 06 08:22:19 crc 
kubenswrapper[4991]: W1006 08:22:19.136764 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e38e446_d0d7_463a_987a_110a8e95fe84.slice/crio-20dd69111dae8d3af59327494f09428aa2c4e00d1bd02c7180ee49057a4a4639 WatchSource:0}: Error finding container 20dd69111dae8d3af59327494f09428aa2c4e00d1bd02c7180ee49057a4a4639: Status 404 returned error can't find the container with id 20dd69111dae8d3af59327494f09428aa2c4e00d1bd02c7180ee49057a4a4639 Oct 06 08:22:19 crc kubenswrapper[4991]: E1006 08:22:19.156868 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 08:22:19 crc kubenswrapper[4991]: E1006 08:22:19.157009 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x5tnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hl6f9_openshift-marketplace(6f4da1f1-397f-4cb5-af9d-cb28306486a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:22:19 crc kubenswrapper[4991]: E1006 08:22:19.158327 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hl6f9" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" Oct 06 08:22:19 crc 
kubenswrapper[4991]: E1006 08:22:19.227740 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 08:22:19 crc kubenswrapper[4991]: E1006 08:22:19.228259 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dxqrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-vzsxl_openshift-marketplace(741cebf1-af9b-4287-9804-47d3c702882d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 08:22:19 crc kubenswrapper[4991]: E1006 08:22:19.229541 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vzsxl" podUID="741cebf1-af9b-4287-9804-47d3c702882d" Oct 06 08:22:19 crc kubenswrapper[4991]: I1006 08:22:19.322616 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkf7" event={"ID":"2541e35c-acef-49c6-8117-1eaefe92a7b5","Type":"ContainerStarted","Data":"a3b89122333a5afba62f4912910ffb332d4015a07e4358dd771fca990c51ecbc"} Oct 06 08:22:19 crc kubenswrapper[4991]: I1006 08:22:19.327169 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-787zw" event={"ID":"3e38e446-d0d7-463a-987a-110a8e95fe84","Type":"ContainerStarted","Data":"20dd69111dae8d3af59327494f09428aa2c4e00d1bd02c7180ee49057a4a4639"} Oct 06 08:22:19 crc kubenswrapper[4991]: E1006 08:22:19.328894 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hl6f9" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" Oct 06 08:22:19 crc kubenswrapper[4991]: E1006 08:22:19.329332 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-vzsxl" podUID="741cebf1-af9b-4287-9804-47d3c702882d" Oct 06 08:22:19 crc kubenswrapper[4991]: E1006 08:22:19.330106 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-g55zd" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" Oct 06 08:22:20 crc kubenswrapper[4991]: I1006 08:22:20.337796 4991 generic.go:334] "Generic (PLEG): container finished" podID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerID="a3b89122333a5afba62f4912910ffb332d4015a07e4358dd771fca990c51ecbc" exitCode=0 Oct 06 08:22:20 crc kubenswrapper[4991]: I1006 08:22:20.339263 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkf7" event={"ID":"2541e35c-acef-49c6-8117-1eaefe92a7b5","Type":"ContainerDied","Data":"a3b89122333a5afba62f4912910ffb332d4015a07e4358dd771fca990c51ecbc"} Oct 06 08:22:20 crc kubenswrapper[4991]: I1006 08:22:20.345234 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-787zw" event={"ID":"3e38e446-d0d7-463a-987a-110a8e95fe84","Type":"ContainerStarted","Data":"09b91e0e42766162039646fd1db04ec820bad745418ccd161eb385c4eb1f3e78"} Oct 06 08:22:20 crc kubenswrapper[4991]: I1006 08:22:20.345264 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-787zw" event={"ID":"3e38e446-d0d7-463a-987a-110a8e95fe84","Type":"ContainerStarted","Data":"9f47a8008de312f2411de582a9ae44cb41542993a3de9ba6e0cf02ca8509a08e"} Oct 06 08:22:20 crc kubenswrapper[4991]: I1006 08:22:20.350703 4991 generic.go:334] "Generic (PLEG): container finished" podID="632906da-50f0-468a-aac9-cb2aea39d813" containerID="ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1" exitCode=0 Oct 06 08:22:20 crc 
kubenswrapper[4991]: I1006 08:22:20.351343 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqgbm" event={"ID":"632906da-50f0-468a-aac9-cb2aea39d813","Type":"ContainerDied","Data":"ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1"} Oct 06 08:22:20 crc kubenswrapper[4991]: I1006 08:22:20.366636 4991 generic.go:334] "Generic (PLEG): container finished" podID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerID="66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c" exitCode=0 Oct 06 08:22:20 crc kubenswrapper[4991]: I1006 08:22:20.366765 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8pwh" event={"ID":"ee1292d9-c828-4aa7-819b-015bcc128d0b","Type":"ContainerDied","Data":"66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c"} Oct 06 08:22:20 crc kubenswrapper[4991]: I1006 08:22:20.404879 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-787zw" podStartSLOduration=167.404818201 podStartE2EDuration="2m47.404818201s" podCreationTimestamp="2025-10-06 08:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:22:20.402858497 +0000 UTC m=+192.140608558" watchObservedRunningTime="2025-10-06 08:22:20.404818201 +0000 UTC m=+192.142568252" Oct 06 08:22:21 crc kubenswrapper[4991]: I1006 08:22:21.390358 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkf7" event={"ID":"2541e35c-acef-49c6-8117-1eaefe92a7b5","Type":"ContainerStarted","Data":"6dcd138061324a5709521ac47c689640f3b46c24b2717b245143bb696a53108e"} Oct 06 08:22:21 crc kubenswrapper[4991]: I1006 08:22:21.393292 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqgbm" 
event={"ID":"632906da-50f0-468a-aac9-cb2aea39d813","Type":"ContainerStarted","Data":"19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236"} Oct 06 08:22:21 crc kubenswrapper[4991]: I1006 08:22:21.717923 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2d9tn" Oct 06 08:22:22 crc kubenswrapper[4991]: I1006 08:22:22.403764 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8pwh" event={"ID":"ee1292d9-c828-4aa7-819b-015bcc128d0b","Type":"ContainerStarted","Data":"d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d"} Oct 06 08:22:22 crc kubenswrapper[4991]: I1006 08:22:22.430132 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wgkf7" podStartSLOduration=4.402624079 podStartE2EDuration="36.430113256s" podCreationTimestamp="2025-10-06 08:21:46 +0000 UTC" firstStartedPulling="2025-10-06 08:21:48.765788511 +0000 UTC m=+160.503538532" lastFinishedPulling="2025-10-06 08:22:20.793277688 +0000 UTC m=+192.531027709" observedRunningTime="2025-10-06 08:22:22.428707628 +0000 UTC m=+194.166457649" watchObservedRunningTime="2025-10-06 08:22:22.430113256 +0000 UTC m=+194.167863277" Oct 06 08:22:22 crc kubenswrapper[4991]: I1006 08:22:22.451902 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r8pwh" podStartSLOduration=3.486311783 podStartE2EDuration="34.451878797s" podCreationTimestamp="2025-10-06 08:21:48 +0000 UTC" firstStartedPulling="2025-10-06 08:21:49.899587759 +0000 UTC m=+161.637337780" lastFinishedPulling="2025-10-06 08:22:20.865154773 +0000 UTC m=+192.602904794" observedRunningTime="2025-10-06 08:22:22.449162642 +0000 UTC m=+194.186912663" watchObservedRunningTime="2025-10-06 08:22:22.451878797 +0000 UTC m=+194.189628818" Oct 06 08:22:22 crc kubenswrapper[4991]: 
I1006 08:22:22.468367 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fqgbm" podStartSLOduration=5.356585249 podStartE2EDuration="37.468342402s" podCreationTimestamp="2025-10-06 08:21:45 +0000 UTC" firstStartedPulling="2025-10-06 08:21:48.787757268 +0000 UTC m=+160.525507289" lastFinishedPulling="2025-10-06 08:22:20.899514401 +0000 UTC m=+192.637264442" observedRunningTime="2025-10-06 08:22:22.466238124 +0000 UTC m=+194.203988145" watchObservedRunningTime="2025-10-06 08:22:22.468342402 +0000 UTC m=+194.206092413" Oct 06 08:22:26 crc kubenswrapper[4991]: I1006 08:22:26.984467 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:22:26 crc kubenswrapper[4991]: I1006 08:22:26.985204 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:22:26 crc kubenswrapper[4991]: I1006 08:22:26.985224 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:22:26 crc kubenswrapper[4991]: I1006 08:22:26.985241 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:22:27 crc kubenswrapper[4991]: I1006 08:22:27.445774 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:22:27 crc kubenswrapper[4991]: I1006 08:22:27.449401 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:22:27 crc kubenswrapper[4991]: I1006 08:22:27.515849 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:22:27 crc kubenswrapper[4991]: I1006 08:22:27.523979 4991 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:22:27 crc kubenswrapper[4991]: I1006 08:22:27.528649 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:22:27 crc kubenswrapper[4991]: I1006 08:22:27.528793 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:22:28 crc kubenswrapper[4991]: I1006 08:22:28.531549 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wgkf7"] Oct 06 08:22:29 crc kubenswrapper[4991]: I1006 08:22:29.127902 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:22:29 crc kubenswrapper[4991]: I1006 08:22:29.128038 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:22:29 crc kubenswrapper[4991]: I1006 08:22:29.199636 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:22:29 crc kubenswrapper[4991]: I1006 08:22:29.448609 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wgkf7" podUID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerName="registry-server" containerID="cri-o://6dcd138061324a5709521ac47c689640f3b46c24b2717b245143bb696a53108e" gracePeriod=2 Oct 06 08:22:29 crc 
kubenswrapper[4991]: I1006 08:22:29.509920 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.456431 4991 generic.go:334] "Generic (PLEG): container finished" podID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerID="6dcd138061324a5709521ac47c689640f3b46c24b2717b245143bb696a53108e" exitCode=0 Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.456486 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkf7" event={"ID":"2541e35c-acef-49c6-8117-1eaefe92a7b5","Type":"ContainerDied","Data":"6dcd138061324a5709521ac47c689640f3b46c24b2717b245143bb696a53108e"} Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.761950 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.817548 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-utilities\") pod \"2541e35c-acef-49c6-8117-1eaefe92a7b5\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.817614 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j82bv\" (UniqueName: \"kubernetes.io/projected/2541e35c-acef-49c6-8117-1eaefe92a7b5-kube-api-access-j82bv\") pod \"2541e35c-acef-49c6-8117-1eaefe92a7b5\" (UID: \"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.817675 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-catalog-content\") pod \"2541e35c-acef-49c6-8117-1eaefe92a7b5\" (UID: 
\"2541e35c-acef-49c6-8117-1eaefe92a7b5\") " Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.818413 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-utilities" (OuterVolumeSpecName: "utilities") pod "2541e35c-acef-49c6-8117-1eaefe92a7b5" (UID: "2541e35c-acef-49c6-8117-1eaefe92a7b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.837545 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2541e35c-acef-49c6-8117-1eaefe92a7b5-kube-api-access-j82bv" (OuterVolumeSpecName: "kube-api-access-j82bv") pod "2541e35c-acef-49c6-8117-1eaefe92a7b5" (UID: "2541e35c-acef-49c6-8117-1eaefe92a7b5"). InnerVolumeSpecName "kube-api-access-j82bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.919446 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.919728 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j82bv\" (UniqueName: \"kubernetes.io/projected/2541e35c-acef-49c6-8117-1eaefe92a7b5-kube-api-access-j82bv\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:30 crc kubenswrapper[4991]: I1006 08:22:30.964072 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2541e35c-acef-49c6-8117-1eaefe92a7b5" (UID: "2541e35c-acef-49c6-8117-1eaefe92a7b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:31 crc kubenswrapper[4991]: I1006 08:22:31.021152 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2541e35c-acef-49c6-8117-1eaefe92a7b5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:31 crc kubenswrapper[4991]: I1006 08:22:31.462648 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgkf7" event={"ID":"2541e35c-acef-49c6-8117-1eaefe92a7b5","Type":"ContainerDied","Data":"04260e5b89a2b5f0af1e1f8b156b5ad99a054a2a7d7339335b6519c283b311ee"} Oct 06 08:22:31 crc kubenswrapper[4991]: I1006 08:22:31.462686 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgkf7" Oct 06 08:22:31 crc kubenswrapper[4991]: I1006 08:22:31.462721 4991 scope.go:117] "RemoveContainer" containerID="6dcd138061324a5709521ac47c689640f3b46c24b2717b245143bb696a53108e" Oct 06 08:22:31 crc kubenswrapper[4991]: I1006 08:22:31.482710 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wgkf7"] Oct 06 08:22:31 crc kubenswrapper[4991]: I1006 08:22:31.487786 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wgkf7"] Oct 06 08:22:31 crc kubenswrapper[4991]: I1006 08:22:31.579455 4991 scope.go:117] "RemoveContainer" containerID="a3b89122333a5afba62f4912910ffb332d4015a07e4358dd771fca990c51ecbc" Oct 06 08:22:31 crc kubenswrapper[4991]: I1006 08:22:31.598446 4991 scope.go:117] "RemoveContainer" containerID="e0560346920730128655c04711bcac32c0e20551c236c2da15820d2222682c38" Oct 06 08:22:32 crc kubenswrapper[4991]: I1006 08:22:32.477189 4991 generic.go:334] "Generic (PLEG): container finished" podID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerID="84194d80f4581e51ab00fd388f8f7706b3d82c734c44c12062a9319bf6ede90f" exitCode=0 Oct 06 
08:22:32 crc kubenswrapper[4991]: I1006 08:22:32.477337 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs6jw" event={"ID":"94d0cbbb-3329-45c8-8c99-f49fc4068d6d","Type":"ContainerDied","Data":"84194d80f4581e51ab00fd388f8f7706b3d82c734c44c12062a9319bf6ede90f"} Oct 06 08:22:33 crc kubenswrapper[4991]: I1006 08:22:33.251873 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2541e35c-acef-49c6-8117-1eaefe92a7b5" path="/var/lib/kubelet/pods/2541e35c-acef-49c6-8117-1eaefe92a7b5/volumes" Oct 06 08:22:33 crc kubenswrapper[4991]: I1006 08:22:33.485076 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bch4t" event={"ID":"3fd33015-5eee-4441-a373-4b062b28fefd","Type":"ContainerStarted","Data":"b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2"} Oct 06 08:22:33 crc kubenswrapper[4991]: I1006 08:22:33.488749 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs6jw" event={"ID":"94d0cbbb-3329-45c8-8c99-f49fc4068d6d","Type":"ContainerStarted","Data":"9eecb659a75b6c0a270f8cc3a7a92438df632d65eaa556ee322a887420e36a49"} Oct 06 08:22:33 crc kubenswrapper[4991]: I1006 08:22:33.527455 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zs6jw" podStartSLOduration=2.57540851 podStartE2EDuration="44.527430082s" podCreationTimestamp="2025-10-06 08:21:49 +0000 UTC" firstStartedPulling="2025-10-06 08:21:50.938203039 +0000 UTC m=+162.675953060" lastFinishedPulling="2025-10-06 08:22:32.890224591 +0000 UTC m=+204.627974632" observedRunningTime="2025-10-06 08:22:33.525019717 +0000 UTC m=+205.262769748" watchObservedRunningTime="2025-10-06 08:22:33.527430082 +0000 UTC m=+205.265180103" Oct 06 08:22:34 crc kubenswrapper[4991]: I1006 08:22:34.495986 4991 generic.go:334] "Generic (PLEG): container finished" podID="3fd33015-5eee-4441-a373-4b062b28fefd" 
containerID="b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2" exitCode=0 Oct 06 08:22:34 crc kubenswrapper[4991]: I1006 08:22:34.496085 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bch4t" event={"ID":"3fd33015-5eee-4441-a373-4b062b28fefd","Type":"ContainerDied","Data":"b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2"} Oct 06 08:22:35 crc kubenswrapper[4991]: I1006 08:22:35.505230 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl6f9" event={"ID":"6f4da1f1-397f-4cb5-af9d-cb28306486a5","Type":"ContainerStarted","Data":"82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0"} Oct 06 08:22:35 crc kubenswrapper[4991]: I1006 08:22:35.507601 4991 generic.go:334] "Generic (PLEG): container finished" podID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerID="8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c" exitCode=0 Oct 06 08:22:35 crc kubenswrapper[4991]: I1006 08:22:35.507647 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g55zd" event={"ID":"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b","Type":"ContainerDied","Data":"8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c"} Oct 06 08:22:35 crc kubenswrapper[4991]: I1006 08:22:35.510325 4991 generic.go:334] "Generic (PLEG): container finished" podID="741cebf1-af9b-4287-9804-47d3c702882d" containerID="b441def18eef6c4fae41823b8ccbf04903c12563423a40126bac525738117320" exitCode=0 Oct 06 08:22:35 crc kubenswrapper[4991]: I1006 08:22:35.510395 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzsxl" event={"ID":"741cebf1-af9b-4287-9804-47d3c702882d","Type":"ContainerDied","Data":"b441def18eef6c4fae41823b8ccbf04903c12563423a40126bac525738117320"} Oct 06 08:22:35 crc kubenswrapper[4991]: I1006 08:22:35.514723 4991 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-bch4t" event={"ID":"3fd33015-5eee-4441-a373-4b062b28fefd","Type":"ContainerStarted","Data":"71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d"} Oct 06 08:22:35 crc kubenswrapper[4991]: I1006 08:22:35.577230 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bch4t" podStartSLOduration=4.459523335 podStartE2EDuration="50.577210981s" podCreationTimestamp="2025-10-06 08:21:45 +0000 UTC" firstStartedPulling="2025-10-06 08:21:48.762179761 +0000 UTC m=+160.499929782" lastFinishedPulling="2025-10-06 08:22:34.879867407 +0000 UTC m=+206.617617428" observedRunningTime="2025-10-06 08:22:35.552208788 +0000 UTC m=+207.289958849" watchObservedRunningTime="2025-10-06 08:22:35.577210981 +0000 UTC m=+207.314960992" Oct 06 08:22:36 crc kubenswrapper[4991]: I1006 08:22:36.365159 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:22:36 crc kubenswrapper[4991]: I1006 08:22:36.365467 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:22:36 crc kubenswrapper[4991]: I1006 08:22:36.523935 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g55zd" event={"ID":"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b","Type":"ContainerStarted","Data":"dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793"} Oct 06 08:22:36 crc kubenswrapper[4991]: I1006 08:22:36.526388 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzsxl" event={"ID":"741cebf1-af9b-4287-9804-47d3c702882d","Type":"ContainerStarted","Data":"95fe1a51bea6ef9f9a4d1b42fc2fd7c41dab482fc0fba2450202e2778fdcf0d1"} Oct 06 08:22:36 crc kubenswrapper[4991]: I1006 08:22:36.528016 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerID="82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0" exitCode=0 Oct 06 08:22:36 crc kubenswrapper[4991]: I1006 08:22:36.528051 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl6f9" event={"ID":"6f4da1f1-397f-4cb5-af9d-cb28306486a5","Type":"ContainerDied","Data":"82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0"} Oct 06 08:22:36 crc kubenswrapper[4991]: I1006 08:22:36.544557 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g55zd" podStartSLOduration=3.345479609 podStartE2EDuration="51.544526486s" podCreationTimestamp="2025-10-06 08:21:45 +0000 UTC" firstStartedPulling="2025-10-06 08:21:47.725657619 +0000 UTC m=+159.463407640" lastFinishedPulling="2025-10-06 08:22:35.924704496 +0000 UTC m=+207.662454517" observedRunningTime="2025-10-06 08:22:36.541880607 +0000 UTC m=+208.279630628" watchObservedRunningTime="2025-10-06 08:22:36.544526486 +0000 UTC m=+208.282276497" Oct 06 08:22:36 crc kubenswrapper[4991]: I1006 08:22:36.595209 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vzsxl" podStartSLOduration=2.401132946 podStartE2EDuration="48.595189494s" podCreationTimestamp="2025-10-06 08:21:48 +0000 UTC" firstStartedPulling="2025-10-06 08:21:49.845479545 +0000 UTC m=+161.583229566" lastFinishedPulling="2025-10-06 08:22:36.039536093 +0000 UTC m=+207.777286114" observedRunningTime="2025-10-06 08:22:36.592752629 +0000 UTC m=+208.330502660" watchObservedRunningTime="2025-10-06 08:22:36.595189494 +0000 UTC m=+208.332939515" Oct 06 08:22:37 crc kubenswrapper[4991]: I1006 08:22:37.400313 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bch4t" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" containerName="registry-server" probeResult="failure" output=< 
Oct 06 08:22:37 crc kubenswrapper[4991]: timeout: failed to connect service ":50051" within 1s Oct 06 08:22:37 crc kubenswrapper[4991]: > Oct 06 08:22:37 crc kubenswrapper[4991]: I1006 08:22:37.536936 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl6f9" event={"ID":"6f4da1f1-397f-4cb5-af9d-cb28306486a5","Type":"ContainerStarted","Data":"5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8"} Oct 06 08:22:37 crc kubenswrapper[4991]: I1006 08:22:37.563350 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hl6f9" podStartSLOduration=3.481756707 podStartE2EDuration="50.563314072s" podCreationTimestamp="2025-10-06 08:21:47 +0000 UTC" firstStartedPulling="2025-10-06 08:21:49.9354571 +0000 UTC m=+161.673207121" lastFinishedPulling="2025-10-06 08:22:37.017014465 +0000 UTC m=+208.754764486" observedRunningTime="2025-10-06 08:22:37.560239935 +0000 UTC m=+209.297989956" watchObservedRunningTime="2025-10-06 08:22:37.563314072 +0000 UTC m=+209.301064093" Oct 06 08:22:38 crc kubenswrapper[4991]: I1006 08:22:38.185024 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:22:38 crc kubenswrapper[4991]: I1006 08:22:38.185408 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:22:38 crc kubenswrapper[4991]: I1006 08:22:38.233427 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:22:38 crc kubenswrapper[4991]: I1006 08:22:38.674995 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:22:38 crc kubenswrapper[4991]: I1006 08:22:38.675739 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:22:38 crc kubenswrapper[4991]: I1006 08:22:38.738356 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:22:39 crc kubenswrapper[4991]: I1006 08:22:39.563667 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:22:39 crc kubenswrapper[4991]: I1006 08:22:39.563734 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:22:39 crc kubenswrapper[4991]: I1006 08:22:39.605948 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:22:39 crc kubenswrapper[4991]: I1006 08:22:39.985558 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vtcb6"] Oct 06 08:22:40 crc kubenswrapper[4991]: I1006 08:22:40.605017 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:22:43 crc kubenswrapper[4991]: I1006 08:22:43.124665 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zs6jw"] Oct 06 08:22:43 crc kubenswrapper[4991]: I1006 08:22:43.125088 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zs6jw" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerName="registry-server" containerID="cri-o://9eecb659a75b6c0a270f8cc3a7a92438df632d65eaa556ee322a887420e36a49" gracePeriod=2 Oct 06 08:22:43 crc kubenswrapper[4991]: I1006 08:22:43.569599 4991 generic.go:334] "Generic (PLEG): container finished" podID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerID="9eecb659a75b6c0a270f8cc3a7a92438df632d65eaa556ee322a887420e36a49" exitCode=0 Oct 06 08:22:43 crc 
kubenswrapper[4991]: I1006 08:22:43.569653 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs6jw" event={"ID":"94d0cbbb-3329-45c8-8c99-f49fc4068d6d","Type":"ContainerDied","Data":"9eecb659a75b6c0a270f8cc3a7a92438df632d65eaa556ee322a887420e36a49"} Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.173276 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.209844 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-catalog-content\") pod \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.210003 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qbwp\" (UniqueName: \"kubernetes.io/projected/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-kube-api-access-5qbwp\") pod \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.210047 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-utilities\") pod \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\" (UID: \"94d0cbbb-3329-45c8-8c99-f49fc4068d6d\") " Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.210793 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-utilities" (OuterVolumeSpecName: "utilities") pod "94d0cbbb-3329-45c8-8c99-f49fc4068d6d" (UID: "94d0cbbb-3329-45c8-8c99-f49fc4068d6d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.216287 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-kube-api-access-5qbwp" (OuterVolumeSpecName: "kube-api-access-5qbwp") pod "94d0cbbb-3329-45c8-8c99-f49fc4068d6d" (UID: "94d0cbbb-3329-45c8-8c99-f49fc4068d6d"). InnerVolumeSpecName "kube-api-access-5qbwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.300570 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94d0cbbb-3329-45c8-8c99-f49fc4068d6d" (UID: "94d0cbbb-3329-45c8-8c99-f49fc4068d6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.311503 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qbwp\" (UniqueName: \"kubernetes.io/projected/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-kube-api-access-5qbwp\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.311538 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.311547 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d0cbbb-3329-45c8-8c99-f49fc4068d6d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.575984 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs6jw" 
event={"ID":"94d0cbbb-3329-45c8-8c99-f49fc4068d6d","Type":"ContainerDied","Data":"1266b3136ec594b8d4cba72ec3ef30fd9ea9f07c94e40e068f37690c5a41ba81"} Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.576037 4991 scope.go:117] "RemoveContainer" containerID="9eecb659a75b6c0a270f8cc3a7a92438df632d65eaa556ee322a887420e36a49" Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.576095 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs6jw" Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.599101 4991 scope.go:117] "RemoveContainer" containerID="84194d80f4581e51ab00fd388f8f7706b3d82c734c44c12062a9319bf6ede90f" Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.622610 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zs6jw"] Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.629950 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zs6jw"] Oct 06 08:22:44 crc kubenswrapper[4991]: I1006 08:22:44.634493 4991 scope.go:117] "RemoveContainer" containerID="e83054da1dbe0adcf4f07f78fc73e9dec7df87b0f36ca38235b8b77713fd45ef" Oct 06 08:22:45 crc kubenswrapper[4991]: I1006 08:22:45.253512 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" path="/var/lib/kubelet/pods/94d0cbbb-3329-45c8-8c99-f49fc4068d6d/volumes" Oct 06 08:22:45 crc kubenswrapper[4991]: I1006 08:22:45.951986 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:22:45 crc kubenswrapper[4991]: I1006 08:22:45.952044 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:22:46 crc kubenswrapper[4991]: I1006 08:22:46.012220 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:22:46 crc kubenswrapper[4991]: I1006 08:22:46.405676 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:22:46 crc kubenswrapper[4991]: I1006 08:22:46.448911 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:22:46 crc kubenswrapper[4991]: I1006 08:22:46.633656 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:22:46 crc kubenswrapper[4991]: I1006 08:22:46.925696 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bch4t"] Oct 06 08:22:47 crc kubenswrapper[4991]: I1006 08:22:47.591886 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bch4t" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" containerName="registry-server" containerID="cri-o://71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d" gracePeriod=2 Oct 06 08:22:47 crc kubenswrapper[4991]: I1006 08:22:47.953760 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:22:47 crc kubenswrapper[4991]: I1006 08:22:47.972223 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-catalog-content\") pod \"3fd33015-5eee-4441-a373-4b062b28fefd\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " Oct 06 08:22:47 crc kubenswrapper[4991]: I1006 08:22:47.972345 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-utilities\") pod \"3fd33015-5eee-4441-a373-4b062b28fefd\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " Oct 06 08:22:47 crc kubenswrapper[4991]: I1006 08:22:47.972397 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg8fv\" (UniqueName: \"kubernetes.io/projected/3fd33015-5eee-4441-a373-4b062b28fefd-kube-api-access-tg8fv\") pod \"3fd33015-5eee-4441-a373-4b062b28fefd\" (UID: \"3fd33015-5eee-4441-a373-4b062b28fefd\") " Oct 06 08:22:47 crc kubenswrapper[4991]: I1006 08:22:47.973772 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-utilities" (OuterVolumeSpecName: "utilities") pod "3fd33015-5eee-4441-a373-4b062b28fefd" (UID: "3fd33015-5eee-4441-a373-4b062b28fefd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:47 crc kubenswrapper[4991]: I1006 08:22:47.979162 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd33015-5eee-4441-a373-4b062b28fefd-kube-api-access-tg8fv" (OuterVolumeSpecName: "kube-api-access-tg8fv") pod "3fd33015-5eee-4441-a373-4b062b28fefd" (UID: "3fd33015-5eee-4441-a373-4b062b28fefd"). InnerVolumeSpecName "kube-api-access-tg8fv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.045743 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fd33015-5eee-4441-a373-4b062b28fefd" (UID: "3fd33015-5eee-4441-a373-4b062b28fefd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.073963 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg8fv\" (UniqueName: \"kubernetes.io/projected/3fd33015-5eee-4441-a373-4b062b28fefd-kube-api-access-tg8fv\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.074005 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.074018 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fd33015-5eee-4441-a373-4b062b28fefd-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.238575 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.598552 4991 generic.go:334] "Generic (PLEG): container finished" podID="3fd33015-5eee-4441-a373-4b062b28fefd" containerID="71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d" exitCode=0 Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.598586 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bch4t" 
event={"ID":"3fd33015-5eee-4441-a373-4b062b28fefd","Type":"ContainerDied","Data":"71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d"} Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.598621 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bch4t" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.599261 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bch4t" event={"ID":"3fd33015-5eee-4441-a373-4b062b28fefd","Type":"ContainerDied","Data":"7634aa47e8d6361de89e455b32844308475a697026b6e89f60ad4e52efb5ecce"} Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.599311 4991 scope.go:117] "RemoveContainer" containerID="71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.617522 4991 scope.go:117] "RemoveContainer" containerID="b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.627118 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bch4t"] Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.632749 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bch4t"] Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.635432 4991 scope.go:117] "RemoveContainer" containerID="9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.655074 4991 scope.go:117] "RemoveContainer" containerID="71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d" Oct 06 08:22:48 crc kubenswrapper[4991]: E1006 08:22:48.655694 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d\": container 
with ID starting with 71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d not found: ID does not exist" containerID="71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.655734 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d"} err="failed to get container status \"71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d\": rpc error: code = NotFound desc = could not find container \"71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d\": container with ID starting with 71fd978ac7682eaa6f2cc650403cf183ff5f115b78007446aa5b065896cd0f5d not found: ID does not exist" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.655782 4991 scope.go:117] "RemoveContainer" containerID="b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2" Oct 06 08:22:48 crc kubenswrapper[4991]: E1006 08:22:48.656224 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2\": container with ID starting with b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2 not found: ID does not exist" containerID="b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.656258 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2"} err="failed to get container status \"b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2\": rpc error: code = NotFound desc = could not find container \"b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2\": container with ID starting with b058c90a2ccc01d52e2212010726983bc9b3bc8c6d61bca50f78ca01f6e6aee2 not 
found: ID does not exist" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.656309 4991 scope.go:117] "RemoveContainer" containerID="9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f" Oct 06 08:22:48 crc kubenswrapper[4991]: E1006 08:22:48.656727 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f\": container with ID starting with 9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f not found: ID does not exist" containerID="9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.656761 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f"} err="failed to get container status \"9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f\": rpc error: code = NotFound desc = could not find container \"9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f\": container with ID starting with 9d265bfc34e50ca601495cd0e5f606e6de854ce1f4c5257563a47d92c3b49e2f not found: ID does not exist" Oct 06 08:22:48 crc kubenswrapper[4991]: I1006 08:22:48.719953 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:22:49 crc kubenswrapper[4991]: I1006 08:22:49.250109 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" path="/var/lib/kubelet/pods/3fd33015-5eee-4441-a373-4b062b28fefd/volumes" Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.328952 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzsxl"] Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.330750 4991 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-vzsxl" podUID="741cebf1-af9b-4287-9804-47d3c702882d" containerName="registry-server" containerID="cri-o://95fe1a51bea6ef9f9a4d1b42fc2fd7c41dab482fc0fba2450202e2778fdcf0d1" gracePeriod=2 Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.620367 4991 generic.go:334] "Generic (PLEG): container finished" podID="741cebf1-af9b-4287-9804-47d3c702882d" containerID="95fe1a51bea6ef9f9a4d1b42fc2fd7c41dab482fc0fba2450202e2778fdcf0d1" exitCode=0 Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.620416 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzsxl" event={"ID":"741cebf1-af9b-4287-9804-47d3c702882d","Type":"ContainerDied","Data":"95fe1a51bea6ef9f9a4d1b42fc2fd7c41dab482fc0fba2450202e2778fdcf0d1"} Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.708314 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.836831 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-utilities\") pod \"741cebf1-af9b-4287-9804-47d3c702882d\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.836874 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-catalog-content\") pod \"741cebf1-af9b-4287-9804-47d3c702882d\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.836906 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxqrm\" (UniqueName: \"kubernetes.io/projected/741cebf1-af9b-4287-9804-47d3c702882d-kube-api-access-dxqrm\") pod 
\"741cebf1-af9b-4287-9804-47d3c702882d\" (UID: \"741cebf1-af9b-4287-9804-47d3c702882d\") " Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.837830 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-utilities" (OuterVolumeSpecName: "utilities") pod "741cebf1-af9b-4287-9804-47d3c702882d" (UID: "741cebf1-af9b-4287-9804-47d3c702882d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.845147 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/741cebf1-af9b-4287-9804-47d3c702882d-kube-api-access-dxqrm" (OuterVolumeSpecName: "kube-api-access-dxqrm") pod "741cebf1-af9b-4287-9804-47d3c702882d" (UID: "741cebf1-af9b-4287-9804-47d3c702882d"). InnerVolumeSpecName "kube-api-access-dxqrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.851540 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "741cebf1-af9b-4287-9804-47d3c702882d" (UID: "741cebf1-af9b-4287-9804-47d3c702882d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.938184 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.938233 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/741cebf1-af9b-4287-9804-47d3c702882d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:51 crc kubenswrapper[4991]: I1006 08:22:51.938246 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxqrm\" (UniqueName: \"kubernetes.io/projected/741cebf1-af9b-4287-9804-47d3c702882d-kube-api-access-dxqrm\") on node \"crc\" DevicePath \"\"" Oct 06 08:22:52 crc kubenswrapper[4991]: I1006 08:22:52.634501 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzsxl" event={"ID":"741cebf1-af9b-4287-9804-47d3c702882d","Type":"ContainerDied","Data":"6746254f7a38d4766db244216060b8cbec2d169bfca0a6d154f924cb2728e714"} Oct 06 08:22:52 crc kubenswrapper[4991]: I1006 08:22:52.634557 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzsxl" Oct 06 08:22:52 crc kubenswrapper[4991]: I1006 08:22:52.634571 4991 scope.go:117] "RemoveContainer" containerID="95fe1a51bea6ef9f9a4d1b42fc2fd7c41dab482fc0fba2450202e2778fdcf0d1" Oct 06 08:22:52 crc kubenswrapper[4991]: I1006 08:22:52.649634 4991 scope.go:117] "RemoveContainer" containerID="b441def18eef6c4fae41823b8ccbf04903c12563423a40126bac525738117320" Oct 06 08:22:52 crc kubenswrapper[4991]: I1006 08:22:52.673760 4991 scope.go:117] "RemoveContainer" containerID="601c6534660abfb8f033417713520e07372b18a95ca983d897d3ccfbb1d01fbd" Oct 06 08:22:52 crc kubenswrapper[4991]: I1006 08:22:52.675398 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzsxl"] Oct 06 08:22:52 crc kubenswrapper[4991]: I1006 08:22:52.697411 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzsxl"] Oct 06 08:22:53 crc kubenswrapper[4991]: I1006 08:22:53.255745 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="741cebf1-af9b-4287-9804-47d3c702882d" path="/var/lib/kubelet/pods/741cebf1-af9b-4287-9804-47d3c702882d/volumes" Oct 06 08:22:57 crc kubenswrapper[4991]: I1006 08:22:57.529435 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:22:57 crc kubenswrapper[4991]: I1006 08:22:57.529898 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:22:57 crc kubenswrapper[4991]: 
I1006 08:22:57.529998 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:22:57 crc kubenswrapper[4991]: I1006 08:22:57.531083 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:22:57 crc kubenswrapper[4991]: I1006 08:22:57.531554 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" containerID="cri-o://b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c" gracePeriod=600 Oct 06 08:22:57 crc kubenswrapper[4991]: I1006 08:22:57.672033 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c" exitCode=0 Oct 06 08:22:57 crc kubenswrapper[4991]: I1006 08:22:57.672117 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c"} Oct 06 08:22:58 crc kubenswrapper[4991]: I1006 08:22:58.680939 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"9395c4bf8dda68ef7b021048ac5697dbf4d4e81b3af0b2f1dc5c5f35a3034cc5"} Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.013495 4991 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" podUID="ac30cd53-f61e-4f56-8110-4eacc0aade3f" containerName="oauth-openshift" containerID="cri-o://c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09" gracePeriod=15 Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.498273 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544264 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-router-certs\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544454 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-idp-0-file-data\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544493 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-service-ca\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544526 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-cliconfig\") pod 
\"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544562 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-error\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544596 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-login\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544643 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-ocp-branding-template\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544679 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-dir\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544710 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-policies\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc 
kubenswrapper[4991]: I1006 08:23:05.544732 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-serving-cert\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544758 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-trusted-ca-bundle\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544782 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-provider-selection\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544816 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss4zd\" (UniqueName: \"kubernetes.io/projected/ac30cd53-f61e-4f56-8110-4eacc0aade3f-kube-api-access-ss4zd\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.544849 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-session\") pod \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\" (UID: \"ac30cd53-f61e-4f56-8110-4eacc0aade3f\") " Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.545556 4991 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.546132 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.548130 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.549240 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.549651 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.573134 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt"] Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.574654 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.575773 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.575999 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576070 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e532e0b-f8e0-4f4e-a42f-22f944b9814f" containerName="pruner" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576098 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e532e0b-f8e0-4f4e-a42f-22f944b9814f" containerName="pruner" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576117 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerName="extract-content" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576128 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerName="extract-content" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576238 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" containerName="extract-content" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576258 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" containerName="extract-content" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576272 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317a739a-78c6-4089-8bc2-a8e3a0388522" containerName="pruner" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576282 4991 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="317a739a-78c6-4089-8bc2-a8e3a0388522" containerName="pruner" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576319 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerName="extract-utilities" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576330 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerName="extract-utilities" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576347 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741cebf1-af9b-4287-9804-47d3c702882d" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576357 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="741cebf1-af9b-4287-9804-47d3c702882d" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576371 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576381 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576403 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerName="extract-content" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576413 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerName="extract-content" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576426 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576436 4991 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576448 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac30cd53-f61e-4f56-8110-4eacc0aade3f" containerName="oauth-openshift" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576458 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac30cd53-f61e-4f56-8110-4eacc0aade3f" containerName="oauth-openshift" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576476 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741cebf1-af9b-4287-9804-47d3c702882d" containerName="extract-utilities" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576487 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="741cebf1-af9b-4287-9804-47d3c702882d" containerName="extract-utilities" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576498 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" containerName="extract-utilities" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.576508 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" containerName="extract-utilities" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.576524 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerName="extract-utilities" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579197 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerName="extract-utilities" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.579225 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579234 4991 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.579248 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741cebf1-af9b-4287-9804-47d3c702882d" containerName="extract-content" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579257 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="741cebf1-af9b-4287-9804-47d3c702882d" containerName="extract-content" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579466 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d0cbbb-3329-45c8-8c99-f49fc4068d6d" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579482 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd33015-5eee-4441-a373-4b062b28fefd" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579492 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac30cd53-f61e-4f56-8110-4eacc0aade3f" containerName="oauth-openshift" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579505 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="741cebf1-af9b-4287-9804-47d3c702882d" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579514 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e532e0b-f8e0-4f4e-a42f-22f944b9814f" containerName="pruner" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579526 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2541e35c-acef-49c6-8117-1eaefe92a7b5" containerName="registry-server" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579544 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="317a739a-78c6-4089-8bc2-a8e3a0388522" containerName="pruner" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579855 4991 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.579964 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt"] Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.580060 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.582850 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac30cd53-f61e-4f56-8110-4eacc0aade3f-kube-api-access-ss4zd" (OuterVolumeSpecName: "kube-api-access-ss4zd") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "kube-api-access-ss4zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.585542 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.585917 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.586232 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.587923 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ac30cd53-f61e-4f56-8110-4eacc0aade3f" (UID: "ac30cd53-f61e-4f56-8110-4eacc0aade3f"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645548 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645608 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-template-login\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645643 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645673 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-router-certs\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645698 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-template-error\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645725 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645749 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-service-ca\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645771 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645794 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/11965d55-9de4-4eeb-bc4b-fad880fecfa1-audit-dir\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645821 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-session\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645858 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-audit-policies\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645882 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-987s8\" (UniqueName: \"kubernetes.io/projected/11965d55-9de4-4eeb-bc4b-fad880fecfa1-kube-api-access-987s8\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645910 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " 
pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645943 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.645994 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646013 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646027 4991 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646041 4991 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646053 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc 
kubenswrapper[4991]: I1006 08:23:05.646068 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646082 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646096 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss4zd\" (UniqueName: \"kubernetes.io/projected/ac30cd53-f61e-4f56-8110-4eacc0aade3f-kube-api-access-ss4zd\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646108 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646121 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646134 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646150 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646163 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.646179 4991 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac30cd53-f61e-4f56-8110-4eacc0aade3f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.728710 4991 generic.go:334] "Generic (PLEG): container finished" podID="ac30cd53-f61e-4f56-8110-4eacc0aade3f" containerID="c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09" exitCode=0 Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.728804 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" event={"ID":"ac30cd53-f61e-4f56-8110-4eacc0aade3f","Type":"ContainerDied","Data":"c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09"} Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.728889 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" event={"ID":"ac30cd53-f61e-4f56-8110-4eacc0aade3f","Type":"ContainerDied","Data":"08a9a50b6787abdabc4bdff51c2e277727550a5a8ee202a13a9096d828c78b89"} Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.728923 4991 scope.go:117] "RemoveContainer" containerID="c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.729605 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-vtcb6" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.747717 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-audit-policies\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.747772 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-987s8\" (UniqueName: \"kubernetes.io/projected/11965d55-9de4-4eeb-bc4b-fad880fecfa1-kube-api-access-987s8\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.747800 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.747830 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.747856 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.747880 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-template-login\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.747905 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.747929 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-router-certs\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.747949 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-template-error\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: 
\"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.747975 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.748112 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-service-ca\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.748854 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.748885 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-audit-policies\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.748936 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.749049 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11965d55-9de4-4eeb-bc4b-fad880fecfa1-audit-dir\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.749068 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-session\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.749514 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-service-ca\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.749569 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/11965d55-9de4-4eeb-bc4b-fad880fecfa1-audit-dir\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 
08:23:05.750931 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.753844 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-session\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.754060 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-template-login\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.754219 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.754583 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.754678 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.755667 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.756332 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-system-router-certs\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.756666 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/11965d55-9de4-4eeb-bc4b-fad880fecfa1-v4-0-config-user-template-error\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.757830 4991 
scope.go:117] "RemoveContainer" containerID="c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09" Oct 06 08:23:05 crc kubenswrapper[4991]: E1006 08:23:05.758529 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09\": container with ID starting with c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09 not found: ID does not exist" containerID="c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.758596 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09"} err="failed to get container status \"c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09\": rpc error: code = NotFound desc = could not find container \"c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09\": container with ID starting with c08081f433f7acaef6a9c2d3daa7247ab66d77a54c3d6e4391346e863b477d09 not found: ID does not exist" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.769579 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-987s8\" (UniqueName: \"kubernetes.io/projected/11965d55-9de4-4eeb-bc4b-fad880fecfa1-kube-api-access-987s8\") pod \"oauth-openshift-b6fcd9dcb-kv6pt\" (UID: \"11965d55-9de4-4eeb-bc4b-fad880fecfa1\") " pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.783255 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vtcb6"] Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 08:23:05.787870 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-vtcb6"] Oct 06 08:23:05 crc kubenswrapper[4991]: I1006 
08:23:05.920928 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:06 crc kubenswrapper[4991]: I1006 08:23:06.119999 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt"] Oct 06 08:23:06 crc kubenswrapper[4991]: I1006 08:23:06.736982 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" event={"ID":"11965d55-9de4-4eeb-bc4b-fad880fecfa1","Type":"ContainerStarted","Data":"b8845db2bcd098319d6e1f09d39fdc1bcfda2aaaad021eb1e6e1204444cb036e"} Oct 06 08:23:06 crc kubenswrapper[4991]: I1006 08:23:06.737610 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" event={"ID":"11965d55-9de4-4eeb-bc4b-fad880fecfa1","Type":"ContainerStarted","Data":"fd93efc801fd59f006d5e37985b787af10ebabbd046568a3b6ed128cd854a1df"} Oct 06 08:23:06 crc kubenswrapper[4991]: I1006 08:23:06.737639 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 08:23:06 crc kubenswrapper[4991]: I1006 08:23:06.760680 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" podStartSLOduration=26.760653596 podStartE2EDuration="26.760653596s" podCreationTimestamp="2025-10-06 08:22:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:23:06.758260983 +0000 UTC m=+238.496011004" watchObservedRunningTime="2025-10-06 08:23:06.760653596 +0000 UTC m=+238.498403657" Oct 06 08:23:06 crc kubenswrapper[4991]: I1006 08:23:06.888006 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-b6fcd9dcb-kv6pt" Oct 06 
08:23:07 crc kubenswrapper[4991]: I1006 08:23:07.254842 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac30cd53-f61e-4f56-8110-4eacc0aade3f" path="/var/lib/kubelet/pods/ac30cd53-f61e-4f56-8110-4eacc0aade3f/volumes" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.153764 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g55zd"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.154551 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g55zd" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerName="registry-server" containerID="cri-o://dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793" gracePeriod=30 Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.165419 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqgbm"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.166418 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fqgbm" podUID="632906da-50f0-468a-aac9-cb2aea39d813" containerName="registry-server" containerID="cri-o://19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236" gracePeriod=30 Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.173601 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p5tk4"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.174010 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" podUID="0a7333dc-b6d2-4513-8574-a95446be656b" containerName="marketplace-operator" containerID="cri-o://658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2" gracePeriod=30 Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.186750 4991 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl6f9"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.187014 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hl6f9" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerName="registry-server" containerID="cri-o://5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8" gracePeriod=30 Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.199396 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8pwh"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.199922 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r8pwh" podUID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerName="registry-server" containerID="cri-o://d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d" gracePeriod=30 Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.202124 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nqf5k"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.202795 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.220678 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nqf5k"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.236256 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4058fb1d-9049-488e-bf00-25f59b04c065-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nqf5k\" (UID: \"4058fb1d-9049-488e-bf00-25f59b04c065\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.236358 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4058fb1d-9049-488e-bf00-25f59b04c065-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nqf5k\" (UID: \"4058fb1d-9049-488e-bf00-25f59b04c065\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.236381 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwnt2\" (UniqueName: \"kubernetes.io/projected/4058fb1d-9049-488e-bf00-25f59b04c065-kube-api-access-jwnt2\") pod \"marketplace-operator-79b997595-nqf5k\" (UID: \"4058fb1d-9049-488e-bf00-25f59b04c065\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.337467 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4058fb1d-9049-488e-bf00-25f59b04c065-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nqf5k\" (UID: 
\"4058fb1d-9049-488e-bf00-25f59b04c065\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.337814 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4058fb1d-9049-488e-bf00-25f59b04c065-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nqf5k\" (UID: \"4058fb1d-9049-488e-bf00-25f59b04c065\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.337853 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwnt2\" (UniqueName: \"kubernetes.io/projected/4058fb1d-9049-488e-bf00-25f59b04c065-kube-api-access-jwnt2\") pod \"marketplace-operator-79b997595-nqf5k\" (UID: \"4058fb1d-9049-488e-bf00-25f59b04c065\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.344538 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4058fb1d-9049-488e-bf00-25f59b04c065-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nqf5k\" (UID: \"4058fb1d-9049-488e-bf00-25f59b04c065\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.347620 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4058fb1d-9049-488e-bf00-25f59b04c065-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nqf5k\" (UID: \"4058fb1d-9049-488e-bf00-25f59b04c065\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.364071 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwnt2\" 
(UniqueName: \"kubernetes.io/projected/4058fb1d-9049-488e-bf00-25f59b04c065-kube-api-access-jwnt2\") pod \"marketplace-operator-79b997595-nqf5k\" (UID: \"4058fb1d-9049-488e-bf00-25f59b04c065\") " pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.576153 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.592376 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.595157 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.596387 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.599983 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.605485 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743307 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-catalog-content\") pod \"ee1292d9-c828-4aa7-819b-015bcc128d0b\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743342 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-catalog-content\") pod \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743373 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-utilities\") pod \"632906da-50f0-468a-aac9-cb2aea39d813\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743392 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rnbs\" (UniqueName: \"kubernetes.io/projected/0a7333dc-b6d2-4513-8574-a95446be656b-kube-api-access-4rnbs\") pod \"0a7333dc-b6d2-4513-8574-a95446be656b\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743411 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-catalog-content\") pod \"632906da-50f0-468a-aac9-cb2aea39d813\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743425 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-wt2nr\" (UniqueName: \"kubernetes.io/projected/ee1292d9-c828-4aa7-819b-015bcc128d0b-kube-api-access-wt2nr\") pod \"ee1292d9-c828-4aa7-819b-015bcc128d0b\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743456 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9rcr\" (UniqueName: \"kubernetes.io/projected/632906da-50f0-468a-aac9-cb2aea39d813-kube-api-access-n9rcr\") pod \"632906da-50f0-468a-aac9-cb2aea39d813\" (UID: \"632906da-50f0-468a-aac9-cb2aea39d813\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743476 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-operator-metrics\") pod \"0a7333dc-b6d2-4513-8574-a95446be656b\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743495 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-utilities\") pod \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743525 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-catalog-content\") pod \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\" (UID: \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743545 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5tnz\" (UniqueName: \"kubernetes.io/projected/6f4da1f1-397f-4cb5-af9d-cb28306486a5-kube-api-access-x5tnz\") pod \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\" (UID: 
\"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743563 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srrtd\" (UniqueName: \"kubernetes.io/projected/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-kube-api-access-srrtd\") pod \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\" (UID: \"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743581 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-trusted-ca\") pod \"0a7333dc-b6d2-4513-8574-a95446be656b\" (UID: \"0a7333dc-b6d2-4513-8574-a95446be656b\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743597 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-utilities\") pod \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\" (UID: \"6f4da1f1-397f-4cb5-af9d-cb28306486a5\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.743623 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-utilities\") pod \"ee1292d9-c828-4aa7-819b-015bcc128d0b\" (UID: \"ee1292d9-c828-4aa7-819b-015bcc128d0b\") " Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.745147 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-utilities" (OuterVolumeSpecName: "utilities") pod "7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" (UID: "7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.747801 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-utilities" (OuterVolumeSpecName: "utilities") pod "ee1292d9-c828-4aa7-819b-015bcc128d0b" (UID: "ee1292d9-c828-4aa7-819b-015bcc128d0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.748522 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0a7333dc-b6d2-4513-8574-a95446be656b" (UID: "0a7333dc-b6d2-4513-8574-a95446be656b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.748095 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-utilities" (OuterVolumeSpecName: "utilities") pod "632906da-50f0-468a-aac9-cb2aea39d813" (UID: "632906da-50f0-468a-aac9-cb2aea39d813"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.748208 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632906da-50f0-468a-aac9-cb2aea39d813-kube-api-access-n9rcr" (OuterVolumeSpecName: "kube-api-access-n9rcr") pod "632906da-50f0-468a-aac9-cb2aea39d813" (UID: "632906da-50f0-468a-aac9-cb2aea39d813"). InnerVolumeSpecName "kube-api-access-n9rcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.748393 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-utilities" (OuterVolumeSpecName: "utilities") pod "6f4da1f1-397f-4cb5-af9d-cb28306486a5" (UID: "6f4da1f1-397f-4cb5-af9d-cb28306486a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.748593 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7333dc-b6d2-4513-8574-a95446be656b-kube-api-access-4rnbs" (OuterVolumeSpecName: "kube-api-access-4rnbs") pod "0a7333dc-b6d2-4513-8574-a95446be656b" (UID: "0a7333dc-b6d2-4513-8574-a95446be656b"). InnerVolumeSpecName "kube-api-access-4rnbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.750451 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0a7333dc-b6d2-4513-8574-a95446be656b" (UID: "0a7333dc-b6d2-4513-8574-a95446be656b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.752360 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-kube-api-access-srrtd" (OuterVolumeSpecName: "kube-api-access-srrtd") pod "7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" (UID: "7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b"). InnerVolumeSpecName "kube-api-access-srrtd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.753585 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4da1f1-397f-4cb5-af9d-cb28306486a5-kube-api-access-x5tnz" (OuterVolumeSpecName: "kube-api-access-x5tnz") pod "6f4da1f1-397f-4cb5-af9d-cb28306486a5" (UID: "6f4da1f1-397f-4cb5-af9d-cb28306486a5"). InnerVolumeSpecName "kube-api-access-x5tnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.756939 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1292d9-c828-4aa7-819b-015bcc128d0b-kube-api-access-wt2nr" (OuterVolumeSpecName: "kube-api-access-wt2nr") pod "ee1292d9-c828-4aa7-819b-015bcc128d0b" (UID: "ee1292d9-c828-4aa7-819b-015bcc128d0b"). InnerVolumeSpecName "kube-api-access-wt2nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.771317 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f4da1f1-397f-4cb5-af9d-cb28306486a5" (UID: "6f4da1f1-397f-4cb5-af9d-cb28306486a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.823753 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "632906da-50f0-468a-aac9-cb2aea39d813" (UID: "632906da-50f0-468a-aac9-cb2aea39d813"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.824140 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" (UID: "7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844776 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844812 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844824 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rnbs\" (UniqueName: \"kubernetes.io/projected/0a7333dc-b6d2-4513-8574-a95446be656b-kube-api-access-4rnbs\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844835 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/632906da-50f0-468a-aac9-cb2aea39d813-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844845 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt2nr\" (UniqueName: \"kubernetes.io/projected/ee1292d9-c828-4aa7-819b-015bcc128d0b-kube-api-access-wt2nr\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844853 4991 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-n9rcr\" (UniqueName: \"kubernetes.io/projected/632906da-50f0-468a-aac9-cb2aea39d813-kube-api-access-n9rcr\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844862 4991 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844872 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844880 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844889 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5tnz\" (UniqueName: \"kubernetes.io/projected/6f4da1f1-397f-4cb5-af9d-cb28306486a5-kube-api-access-x5tnz\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844897 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srrtd\" (UniqueName: \"kubernetes.io/projected/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b-kube-api-access-srrtd\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844905 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f4da1f1-397f-4cb5-af9d-cb28306486a5-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844913 4991 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0a7333dc-b6d2-4513-8574-a95446be656b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.844921 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.847273 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee1292d9-c828-4aa7-819b-015bcc128d0b" (UID: "ee1292d9-c828-4aa7-819b-015bcc128d0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.850261 4991 generic.go:334] "Generic (PLEG): container finished" podID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerID="d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d" exitCode=0 Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.850339 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8pwh" event={"ID":"ee1292d9-c828-4aa7-819b-015bcc128d0b","Type":"ContainerDied","Data":"d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d"} Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.850377 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8pwh" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.850383 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8pwh" event={"ID":"ee1292d9-c828-4aa7-819b-015bcc128d0b","Type":"ContainerDied","Data":"a97c699a5419a3e3c5996827c633045520ebe75f3e11a30ade9425d654a5c5d4"} Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.850396 4991 scope.go:117] "RemoveContainer" containerID="d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.852757 4991 generic.go:334] "Generic (PLEG): container finished" podID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerID="5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8" exitCode=0 Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.852814 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hl6f9" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.852829 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl6f9" event={"ID":"6f4da1f1-397f-4cb5-af9d-cb28306486a5","Type":"ContainerDied","Data":"5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8"} Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.852862 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hl6f9" event={"ID":"6f4da1f1-397f-4cb5-af9d-cb28306486a5","Type":"ContainerDied","Data":"9f68343f6c4da1c4eeb427d2a26b3c54c6c6a8d75ba00e3dbb9ce6235bf91efc"} Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.857268 4991 generic.go:334] "Generic (PLEG): container finished" podID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerID="dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793" exitCode=0 Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.857309 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g55zd" event={"ID":"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b","Type":"ContainerDied","Data":"dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793"} Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.857405 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g55zd" event={"ID":"7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b","Type":"ContainerDied","Data":"c5b7a66fe50c53bb868142fe7e6d5d230378501e61148b4ec803341259f3c792"} Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.857454 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g55zd" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.869424 4991 generic.go:334] "Generic (PLEG): container finished" podID="0a7333dc-b6d2-4513-8574-a95446be656b" containerID="658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2" exitCode=0 Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.869511 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.869563 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" event={"ID":"0a7333dc-b6d2-4513-8574-a95446be656b","Type":"ContainerDied","Data":"658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2"} Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.869613 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p5tk4" event={"ID":"0a7333dc-b6d2-4513-8574-a95446be656b","Type":"ContainerDied","Data":"30f04d0eac527ef4689ecdc50654f18fb1b7ed928b5a15280082a80724283473"} Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.874371 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fqgbm" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.874435 4991 generic.go:334] "Generic (PLEG): container finished" podID="632906da-50f0-468a-aac9-cb2aea39d813" containerID="19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236" exitCode=0 Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.874592 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqgbm" event={"ID":"632906da-50f0-468a-aac9-cb2aea39d813","Type":"ContainerDied","Data":"19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236"} Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.874703 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqgbm" event={"ID":"632906da-50f0-468a-aac9-cb2aea39d813","Type":"ContainerDied","Data":"71beaaa2b540b438c48c7dcd6d76b4b634ae35cc2abb8e1f25eccae335d0d8ad"} Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.881959 4991 scope.go:117] "RemoveContainer" 
containerID="66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.885399 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl6f9"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.901660 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hl6f9"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.908610 4991 scope.go:117] "RemoveContainer" containerID="ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.917744 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8pwh"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.923236 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r8pwh"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.929267 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g55zd"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.932151 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g55zd"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.939160 4991 scope.go:117] "RemoveContainer" containerID="d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d" Oct 06 08:23:26 crc kubenswrapper[4991]: E1006 08:23:26.939549 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d\": container with ID starting with d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d not found: ID does not exist" containerID="d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.939579 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d"} err="failed to get container status \"d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d\": rpc error: code = NotFound desc = could not find container \"d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d\": container with ID starting with d35cff7a6f1c2730ec86b73755bd7f5f9a09faddaca608764623192a18751a9d not found: ID does not exist" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.939600 4991 scope.go:117] "RemoveContainer" containerID="66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c" Oct 06 08:23:26 crc kubenswrapper[4991]: E1006 08:23:26.940005 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c\": container with ID starting with 66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c not found: ID does not exist" containerID="66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.940071 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c"} err="failed to get container status \"66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c\": rpc error: code = NotFound desc = could not find container \"66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c\": container with ID starting with 66da844d3b9f1ba2a7d2b1ca57555b4525d675a206fd9dd8e7dca77a389a858c not found: ID does not exist" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.940099 4991 scope.go:117] "RemoveContainer" containerID="ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085" Oct 06 08:23:26 crc kubenswrapper[4991]: E1006 
08:23:26.940420 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085\": container with ID starting with ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085 not found: ID does not exist" containerID="ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.940461 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085"} err="failed to get container status \"ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085\": rpc error: code = NotFound desc = could not find container \"ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085\": container with ID starting with ea46516d66194dd4d896dbf340b5d68689a97c03bc1d034f8d1338c64932d085 not found: ID does not exist" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.940483 4991 scope.go:117] "RemoveContainer" containerID="5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.946067 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee1292d9-c828-4aa7-819b-015bcc128d0b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.947742 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p5tk4"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.953127 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p5tk4"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.956714 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-fqgbm"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.959010 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fqgbm"] Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.962352 4991 scope.go:117] "RemoveContainer" containerID="82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.972996 4991 scope.go:117] "RemoveContainer" containerID="ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.985955 4991 scope.go:117] "RemoveContainer" containerID="5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8" Oct 06 08:23:26 crc kubenswrapper[4991]: E1006 08:23:26.986286 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8\": container with ID starting with 5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8 not found: ID does not exist" containerID="5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.986344 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8"} err="failed to get container status \"5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8\": rpc error: code = NotFound desc = could not find container \"5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8\": container with ID starting with 5712d65622b4cc38d0e9fcfad3f8d895b6618e83cc666f6b26749704270877e8 not found: ID does not exist" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.986377 4991 scope.go:117] "RemoveContainer" 
containerID="82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0" Oct 06 08:23:26 crc kubenswrapper[4991]: E1006 08:23:26.986651 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0\": container with ID starting with 82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0 not found: ID does not exist" containerID="82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.986690 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0"} err="failed to get container status \"82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0\": rpc error: code = NotFound desc = could not find container \"82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0\": container with ID starting with 82f5663e5cfffca6ff8a1bc2bba226d150efa70755da618c8a798daaeab0feb0 not found: ID does not exist" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.986719 4991 scope.go:117] "RemoveContainer" containerID="ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80" Oct 06 08:23:26 crc kubenswrapper[4991]: E1006 08:23:26.987020 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80\": container with ID starting with ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80 not found: ID does not exist" containerID="ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.987046 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80"} err="failed to get container status \"ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80\": rpc error: code = NotFound desc = could not find container \"ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80\": container with ID starting with ad8e9fa68d2363fe06f239a176357af6c4b3ce973c8f022dea9d0f7ea4791b80 not found: ID does not exist" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.987062 4991 scope.go:117] "RemoveContainer" containerID="dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793" Oct 06 08:23:26 crc kubenswrapper[4991]: I1006 08:23:26.997942 4991 scope.go:117] "RemoveContainer" containerID="8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.010453 4991 scope.go:117] "RemoveContainer" containerID="32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.018268 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nqf5k"] Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.029777 4991 scope.go:117] "RemoveContainer" containerID="dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793" Oct 06 08:23:27 crc kubenswrapper[4991]: E1006 08:23:27.030117 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793\": container with ID starting with dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793 not found: ID does not exist" containerID="dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.030153 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793"} err="failed to get container status \"dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793\": rpc error: code = NotFound desc = could not find container \"dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793\": container with ID starting with dae6a1fa00157828c3d60435a9e07ee58af9e841f32047e6832951f01e8f7793 not found: ID does not exist" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.030183 4991 scope.go:117] "RemoveContainer" containerID="8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c" Oct 06 08:23:27 crc kubenswrapper[4991]: E1006 08:23:27.030806 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c\": container with ID starting with 8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c not found: ID does not exist" containerID="8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.030831 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c"} err="failed to get container status \"8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c\": rpc error: code = NotFound desc = could not find container \"8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c\": container with ID starting with 8bc87a4331c0338999f4660ad28f16cf045b12897a5f2ab7cb1f798d8d937e5c not found: ID does not exist" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.030848 4991 scope.go:117] "RemoveContainer" containerID="32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4" Oct 06 08:23:27 crc kubenswrapper[4991]: E1006 08:23:27.031121 4991 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4\": container with ID starting with 32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4 not found: ID does not exist" containerID="32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.031150 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4"} err="failed to get container status \"32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4\": rpc error: code = NotFound desc = could not find container \"32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4\": container with ID starting with 32727a035cfea73e851ddc4b30fc09462ed5b2d87315f5bf6d3aada6308f93c4 not found: ID does not exist" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.031171 4991 scope.go:117] "RemoveContainer" containerID="658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.055702 4991 scope.go:117] "RemoveContainer" containerID="658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2" Oct 06 08:23:27 crc kubenswrapper[4991]: E1006 08:23:27.057883 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2\": container with ID starting with 658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2 not found: ID does not exist" containerID="658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.057931 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2"} 
err="failed to get container status \"658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2\": rpc error: code = NotFound desc = could not find container \"658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2\": container with ID starting with 658d5a3583455c188e83a314a097cbd28e2c1fc3c152ffb127e3c2f9e0aff1c2 not found: ID does not exist" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.057961 4991 scope.go:117] "RemoveContainer" containerID="19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.077425 4991 scope.go:117] "RemoveContainer" containerID="ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.093354 4991 scope.go:117] "RemoveContainer" containerID="f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.111002 4991 scope.go:117] "RemoveContainer" containerID="19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236" Oct 06 08:23:27 crc kubenswrapper[4991]: E1006 08:23:27.112973 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236\": container with ID starting with 19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236 not found: ID does not exist" containerID="19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.113099 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236"} err="failed to get container status \"19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236\": rpc error: code = NotFound desc = could not find container 
\"19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236\": container with ID starting with 19967f47df237bfc7bd4a3b15f6c8c99165d6c4b6bfa2568ad0ecc9faf882236 not found: ID does not exist" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.113216 4991 scope.go:117] "RemoveContainer" containerID="ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1" Oct 06 08:23:27 crc kubenswrapper[4991]: E1006 08:23:27.113799 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1\": container with ID starting with ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1 not found: ID does not exist" containerID="ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.113867 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1"} err="failed to get container status \"ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1\": rpc error: code = NotFound desc = could not find container \"ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1\": container with ID starting with ca43580d823c71ccbd431f1621618e9336b5dd6334cb4439d1e5919f6701e5c1 not found: ID does not exist" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.113920 4991 scope.go:117] "RemoveContainer" containerID="f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c" Oct 06 08:23:27 crc kubenswrapper[4991]: E1006 08:23:27.114229 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c\": container with ID starting with f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c not found: ID does not exist" 
containerID="f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.114378 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c"} err="failed to get container status \"f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c\": rpc error: code = NotFound desc = could not find container \"f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c\": container with ID starting with f1e6de9ab89ffb3ae16f0e9aed788c0dd68c232181be4c2988f314719b11b52c not found: ID does not exist" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.251263 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7333dc-b6d2-4513-8574-a95446be656b" path="/var/lib/kubelet/pods/0a7333dc-b6d2-4513-8574-a95446be656b/volumes" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.252090 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632906da-50f0-468a-aac9-cb2aea39d813" path="/var/lib/kubelet/pods/632906da-50f0-468a-aac9-cb2aea39d813/volumes" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.252838 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" path="/var/lib/kubelet/pods/6f4da1f1-397f-4cb5-af9d-cb28306486a5/volumes" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.254495 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" path="/var/lib/kubelet/pods/7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b/volumes" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.255600 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1292d9-c828-4aa7-819b-015bcc128d0b" path="/var/lib/kubelet/pods/ee1292d9-c828-4aa7-819b-015bcc128d0b/volumes" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.888319 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" event={"ID":"4058fb1d-9049-488e-bf00-25f59b04c065","Type":"ContainerStarted","Data":"f3b742adcdadbdc3645ba0d7cdfcaef901d755b1a169570104b8df2535fecf73"} Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.888630 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.888642 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" event={"ID":"4058fb1d-9049-488e-bf00-25f59b04c065","Type":"ContainerStarted","Data":"9f319c023aa993c28125eb2a17fab5da8406b72f2c2807c341058bc99796931e"} Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.891151 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" Oct 06 08:23:27 crc kubenswrapper[4991]: I1006 08:23:27.906066 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nqf5k" podStartSLOduration=1.906043307 podStartE2EDuration="1.906043307s" podCreationTimestamp="2025-10-06 08:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:23:27.905896593 +0000 UTC m=+259.643646604" watchObservedRunningTime="2025-10-06 08:23:27.906043307 +0000 UTC m=+259.643793328" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374329 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l9kb2"] Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374549 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerName="extract-utilities" Oct 06 08:23:28 crc 
kubenswrapper[4991]: I1006 08:23:28.374567 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerName="extract-utilities" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374583 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerName="extract-content" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374593 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerName="extract-content" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374602 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374611 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374628 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerName="extract-utilities" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374635 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerName="extract-utilities" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374645 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632906da-50f0-468a-aac9-cb2aea39d813" containerName="extract-utilities" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374653 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="632906da-50f0-468a-aac9-cb2aea39d813" containerName="extract-utilities" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374662 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerName="extract-content" Oct 06 08:23:28 crc 
kubenswrapper[4991]: I1006 08:23:28.374669 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerName="extract-content" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374679 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632906da-50f0-468a-aac9-cb2aea39d813" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374688 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="632906da-50f0-468a-aac9-cb2aea39d813" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374696 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7333dc-b6d2-4513-8574-a95446be656b" containerName="marketplace-operator" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374704 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7333dc-b6d2-4513-8574-a95446be656b" containerName="marketplace-operator" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374716 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerName="extract-utilities" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374724 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerName="extract-utilities" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374733 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerName="extract-content" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374741 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerName="extract-content" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374751 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632906da-50f0-468a-aac9-cb2aea39d813" containerName="extract-content" Oct 06 08:23:28 crc 
kubenswrapper[4991]: I1006 08:23:28.374758 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="632906da-50f0-468a-aac9-cb2aea39d813" containerName="extract-content" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374768 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374775 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: E1006 08:23:28.374788 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374795 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374902 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1292d9-c828-4aa7-819b-015bcc128d0b" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374920 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4da1f1-397f-4cb5-af9d-cb28306486a5" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374930 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="632906da-50f0-468a-aac9-cb2aea39d813" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374943 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7789ce2e-bc68-4e0d-a04f-3d90cfd5b11b" containerName="registry-server" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.374953 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7333dc-b6d2-4513-8574-a95446be656b" containerName="marketplace-operator" Oct 06 
08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.375779 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.378608 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.388102 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9kb2"] Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.565041 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938e0ad9-f781-4d0c-be52-67939a233f2f-utilities\") pod \"redhat-marketplace-l9kb2\" (UID: \"938e0ad9-f781-4d0c-be52-67939a233f2f\") " pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.565201 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938e0ad9-f781-4d0c-be52-67939a233f2f-catalog-content\") pod \"redhat-marketplace-l9kb2\" (UID: \"938e0ad9-f781-4d0c-be52-67939a233f2f\") " pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.565332 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p8gc\" (UniqueName: \"kubernetes.io/projected/938e0ad9-f781-4d0c-be52-67939a233f2f-kube-api-access-9p8gc\") pod \"redhat-marketplace-l9kb2\" (UID: \"938e0ad9-f781-4d0c-be52-67939a233f2f\") " pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.579567 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p9vs2"] Oct 06 08:23:28 crc kubenswrapper[4991]: 
I1006 08:23:28.583185 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.589515 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.592724 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p9vs2"] Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.666375 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938e0ad9-f781-4d0c-be52-67939a233f2f-utilities\") pod \"redhat-marketplace-l9kb2\" (UID: \"938e0ad9-f781-4d0c-be52-67939a233f2f\") " pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.666482 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938e0ad9-f781-4d0c-be52-67939a233f2f-catalog-content\") pod \"redhat-marketplace-l9kb2\" (UID: \"938e0ad9-f781-4d0c-be52-67939a233f2f\") " pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.667008 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/938e0ad9-f781-4d0c-be52-67939a233f2f-catalog-content\") pod \"redhat-marketplace-l9kb2\" (UID: \"938e0ad9-f781-4d0c-be52-67939a233f2f\") " pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.667010 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/938e0ad9-f781-4d0c-be52-67939a233f2f-utilities\") pod \"redhat-marketplace-l9kb2\" (UID: \"938e0ad9-f781-4d0c-be52-67939a233f2f\") " 
pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.666521 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p8gc\" (UniqueName: \"kubernetes.io/projected/938e0ad9-f781-4d0c-be52-67939a233f2f-kube-api-access-9p8gc\") pod \"redhat-marketplace-l9kb2\" (UID: \"938e0ad9-f781-4d0c-be52-67939a233f2f\") " pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.687124 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p8gc\" (UniqueName: \"kubernetes.io/projected/938e0ad9-f781-4d0c-be52-67939a233f2f-kube-api-access-9p8gc\") pod \"redhat-marketplace-l9kb2\" (UID: \"938e0ad9-f781-4d0c-be52-67939a233f2f\") " pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.694814 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.768732 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/094a7c3a-f150-42f1-bc2b-5e53b2565058-catalog-content\") pod \"redhat-operators-p9vs2\" (UID: \"094a7c3a-f150-42f1-bc2b-5e53b2565058\") " pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.768777 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlfvz\" (UniqueName: \"kubernetes.io/projected/094a7c3a-f150-42f1-bc2b-5e53b2565058-kube-api-access-nlfvz\") pod \"redhat-operators-p9vs2\" (UID: \"094a7c3a-f150-42f1-bc2b-5e53b2565058\") " pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.768809 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/094a7c3a-f150-42f1-bc2b-5e53b2565058-utilities\") pod \"redhat-operators-p9vs2\" (UID: \"094a7c3a-f150-42f1-bc2b-5e53b2565058\") " pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.870581 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/094a7c3a-f150-42f1-bc2b-5e53b2565058-catalog-content\") pod \"redhat-operators-p9vs2\" (UID: \"094a7c3a-f150-42f1-bc2b-5e53b2565058\") " pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.870915 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlfvz\" (UniqueName: \"kubernetes.io/projected/094a7c3a-f150-42f1-bc2b-5e53b2565058-kube-api-access-nlfvz\") pod \"redhat-operators-p9vs2\" (UID: \"094a7c3a-f150-42f1-bc2b-5e53b2565058\") " pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.870949 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/094a7c3a-f150-42f1-bc2b-5e53b2565058-utilities\") pod \"redhat-operators-p9vs2\" (UID: \"094a7c3a-f150-42f1-bc2b-5e53b2565058\") " pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.871509 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/094a7c3a-f150-42f1-bc2b-5e53b2565058-catalog-content\") pod \"redhat-operators-p9vs2\" (UID: \"094a7c3a-f150-42f1-bc2b-5e53b2565058\") " pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.871588 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/094a7c3a-f150-42f1-bc2b-5e53b2565058-utilities\") pod \"redhat-operators-p9vs2\" (UID: \"094a7c3a-f150-42f1-bc2b-5e53b2565058\") " pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.873161 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l9kb2"] Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.894739 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlfvz\" (UniqueName: \"kubernetes.io/projected/094a7c3a-f150-42f1-bc2b-5e53b2565058-kube-api-access-nlfvz\") pod \"redhat-operators-p9vs2\" (UID: \"094a7c3a-f150-42f1-bc2b-5e53b2565058\") " pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.903186 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9kb2" event={"ID":"938e0ad9-f781-4d0c-be52-67939a233f2f","Type":"ContainerStarted","Data":"2ca0613c8f4d4ff4abcc47c635610d740c789b925bdaa0557b4ace0c0a4f1886"} Oct 06 08:23:28 crc kubenswrapper[4991]: I1006 08:23:28.903491 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:29 crc kubenswrapper[4991]: I1006 08:23:29.299364 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p9vs2"] Oct 06 08:23:29 crc kubenswrapper[4991]: W1006 08:23:29.305999 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod094a7c3a_f150_42f1_bc2b_5e53b2565058.slice/crio-f1cbcfdb6b1d7a38be04edcc7864b50f3ea9b5c3a059ee51c7d68769c9dd02e4 WatchSource:0}: Error finding container f1cbcfdb6b1d7a38be04edcc7864b50f3ea9b5c3a059ee51c7d68769c9dd02e4: Status 404 returned error can't find the container with id f1cbcfdb6b1d7a38be04edcc7864b50f3ea9b5c3a059ee51c7d68769c9dd02e4 Oct 06 08:23:29 crc kubenswrapper[4991]: I1006 08:23:29.909550 4991 generic.go:334] "Generic (PLEG): container finished" podID="094a7c3a-f150-42f1-bc2b-5e53b2565058" containerID="aabea56873cf2a74df693dfaee1547dd1bd864ba45326efd64100c46016a8b17" exitCode=0 Oct 06 08:23:29 crc kubenswrapper[4991]: I1006 08:23:29.909627 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9vs2" event={"ID":"094a7c3a-f150-42f1-bc2b-5e53b2565058","Type":"ContainerDied","Data":"aabea56873cf2a74df693dfaee1547dd1bd864ba45326efd64100c46016a8b17"} Oct 06 08:23:29 crc kubenswrapper[4991]: I1006 08:23:29.910280 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9vs2" event={"ID":"094a7c3a-f150-42f1-bc2b-5e53b2565058","Type":"ContainerStarted","Data":"f1cbcfdb6b1d7a38be04edcc7864b50f3ea9b5c3a059ee51c7d68769c9dd02e4"} Oct 06 08:23:29 crc kubenswrapper[4991]: I1006 08:23:29.912786 4991 generic.go:334] "Generic (PLEG): container finished" podID="938e0ad9-f781-4d0c-be52-67939a233f2f" containerID="e188d01c21d50a234f1bcf111c5001dc21a0c3cb19d9213a26d25e872e0eee50" exitCode=0 Oct 06 08:23:29 crc kubenswrapper[4991]: I1006 08:23:29.912888 
4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9kb2" event={"ID":"938e0ad9-f781-4d0c-be52-67939a233f2f","Type":"ContainerDied","Data":"e188d01c21d50a234f1bcf111c5001dc21a0c3cb19d9213a26d25e872e0eee50"} Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.772920 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jzqrc"] Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.774206 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.776243 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.785401 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jzqrc"] Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.893921 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e91b878-dd79-4d4f-8e3c-8ef2cea97e04-catalog-content\") pod \"certified-operators-jzqrc\" (UID: \"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04\") " pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.894006 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e91b878-dd79-4d4f-8e3c-8ef2cea97e04-utilities\") pod \"certified-operators-jzqrc\" (UID: \"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04\") " pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.894327 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9z6q\" 
(UniqueName: \"kubernetes.io/projected/7e91b878-dd79-4d4f-8e3c-8ef2cea97e04-kube-api-access-d9z6q\") pod \"certified-operators-jzqrc\" (UID: \"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04\") " pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.919735 4991 generic.go:334] "Generic (PLEG): container finished" podID="938e0ad9-f781-4d0c-be52-67939a233f2f" containerID="bad1d2152121334d4803bdab18a4b6b1689e5be3a5d332e021d753b18ea99ee1" exitCode=0 Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.919810 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9kb2" event={"ID":"938e0ad9-f781-4d0c-be52-67939a233f2f","Type":"ContainerDied","Data":"bad1d2152121334d4803bdab18a4b6b1689e5be3a5d332e021d753b18ea99ee1"} Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.923525 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9vs2" event={"ID":"094a7c3a-f150-42f1-bc2b-5e53b2565058","Type":"ContainerStarted","Data":"308a56d13234bc541d2fa939a7a4eb48d475bd2870fa72ee977ecd3c54c3786c"} Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.977579 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hfklb"] Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.981616 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.982841 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hfklb"] Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.983840 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.995058 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e91b878-dd79-4d4f-8e3c-8ef2cea97e04-catalog-content\") pod \"certified-operators-jzqrc\" (UID: \"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04\") " pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.995111 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e91b878-dd79-4d4f-8e3c-8ef2cea97e04-utilities\") pod \"certified-operators-jzqrc\" (UID: \"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04\") " pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.995168 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9z6q\" (UniqueName: \"kubernetes.io/projected/7e91b878-dd79-4d4f-8e3c-8ef2cea97e04-kube-api-access-d9z6q\") pod \"certified-operators-jzqrc\" (UID: \"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04\") " pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.995531 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e91b878-dd79-4d4f-8e3c-8ef2cea97e04-catalog-content\") pod \"certified-operators-jzqrc\" (UID: \"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04\") " 
pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:30 crc kubenswrapper[4991]: I1006 08:23:30.995623 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e91b878-dd79-4d4f-8e3c-8ef2cea97e04-utilities\") pod \"certified-operators-jzqrc\" (UID: \"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04\") " pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.020153 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9z6q\" (UniqueName: \"kubernetes.io/projected/7e91b878-dd79-4d4f-8e3c-8ef2cea97e04-kube-api-access-d9z6q\") pod \"certified-operators-jzqrc\" (UID: \"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04\") " pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.096216 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be81a987-5591-4f8a-ae8c-1fda1597892e-utilities\") pod \"community-operators-hfklb\" (UID: \"be81a987-5591-4f8a-ae8c-1fda1597892e\") " pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.096278 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be81a987-5591-4f8a-ae8c-1fda1597892e-catalog-content\") pod \"community-operators-hfklb\" (UID: \"be81a987-5591-4f8a-ae8c-1fda1597892e\") " pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.096378 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s97k4\" (UniqueName: \"kubernetes.io/projected/be81a987-5591-4f8a-ae8c-1fda1597892e-kube-api-access-s97k4\") pod \"community-operators-hfklb\" (UID: 
\"be81a987-5591-4f8a-ae8c-1fda1597892e\") " pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.142453 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.197635 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be81a987-5591-4f8a-ae8c-1fda1597892e-utilities\") pod \"community-operators-hfklb\" (UID: \"be81a987-5591-4f8a-ae8c-1fda1597892e\") " pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.197687 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be81a987-5591-4f8a-ae8c-1fda1597892e-catalog-content\") pod \"community-operators-hfklb\" (UID: \"be81a987-5591-4f8a-ae8c-1fda1597892e\") " pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.197724 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s97k4\" (UniqueName: \"kubernetes.io/projected/be81a987-5591-4f8a-ae8c-1fda1597892e-kube-api-access-s97k4\") pod \"community-operators-hfklb\" (UID: \"be81a987-5591-4f8a-ae8c-1fda1597892e\") " pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.198109 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be81a987-5591-4f8a-ae8c-1fda1597892e-utilities\") pod \"community-operators-hfklb\" (UID: \"be81a987-5591-4f8a-ae8c-1fda1597892e\") " pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.198197 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be81a987-5591-4f8a-ae8c-1fda1597892e-catalog-content\") pod \"community-operators-hfklb\" (UID: \"be81a987-5591-4f8a-ae8c-1fda1597892e\") " pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.217607 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s97k4\" (UniqueName: \"kubernetes.io/projected/be81a987-5591-4f8a-ae8c-1fda1597892e-kube-api-access-s97k4\") pod \"community-operators-hfklb\" (UID: \"be81a987-5591-4f8a-ae8c-1fda1597892e\") " pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.296010 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.455968 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hfklb"] Oct 06 08:23:31 crc kubenswrapper[4991]: W1006 08:23:31.458943 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe81a987_5591_4f8a_ae8c_1fda1597892e.slice/crio-32801ec65145b999eb0047767f809b2385b6a72493e3bc6d886a0c8330722066 WatchSource:0}: Error finding container 32801ec65145b999eb0047767f809b2385b6a72493e3bc6d886a0c8330722066: Status 404 returned error can't find the container with id 32801ec65145b999eb0047767f809b2385b6a72493e3bc6d886a0c8330722066 Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.536378 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jzqrc"] Oct 06 08:23:31 crc kubenswrapper[4991]: W1006 08:23:31.545490 4991 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e91b878_dd79_4d4f_8e3c_8ef2cea97e04.slice/crio-626430de847853e3bd7d4fde999b0e6c01b360faa6525176b8eea8ce27c8228b WatchSource:0}: Error finding container 626430de847853e3bd7d4fde999b0e6c01b360faa6525176b8eea8ce27c8228b: Status 404 returned error can't find the container with id 626430de847853e3bd7d4fde999b0e6c01b360faa6525176b8eea8ce27c8228b Oct 06 08:23:31 crc kubenswrapper[4991]: E1006 08:23:31.777970 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e91b878_dd79_4d4f_8e3c_8ef2cea97e04.slice/crio-c59df355e5ee62b766e1ffbc1d591275bf219402415f3fc41850ba826078ef1b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e91b878_dd79_4d4f_8e3c_8ef2cea97e04.slice/crio-conmon-c59df355e5ee62b766e1ffbc1d591275bf219402415f3fc41850ba826078ef1b.scope\": RecentStats: unable to find data in memory cache]" Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.930065 4991 generic.go:334] "Generic (PLEG): container finished" podID="094a7c3a-f150-42f1-bc2b-5e53b2565058" containerID="308a56d13234bc541d2fa939a7a4eb48d475bd2870fa72ee977ecd3c54c3786c" exitCode=0 Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.930114 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9vs2" event={"ID":"094a7c3a-f150-42f1-bc2b-5e53b2565058","Type":"ContainerDied","Data":"308a56d13234bc541d2fa939a7a4eb48d475bd2870fa72ee977ecd3c54c3786c"} Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.935080 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l9kb2" event={"ID":"938e0ad9-f781-4d0c-be52-67939a233f2f","Type":"ContainerStarted","Data":"bffcd93702452cd18154a7758c8c4fca6346a598ea6fc26e300e53c8823eeffe"} Oct 06 08:23:31 crc 
kubenswrapper[4991]: I1006 08:23:31.938443 4991 generic.go:334] "Generic (PLEG): container finished" podID="7e91b878-dd79-4d4f-8e3c-8ef2cea97e04" containerID="c59df355e5ee62b766e1ffbc1d591275bf219402415f3fc41850ba826078ef1b" exitCode=0 Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.938514 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqrc" event={"ID":"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04","Type":"ContainerDied","Data":"c59df355e5ee62b766e1ffbc1d591275bf219402415f3fc41850ba826078ef1b"} Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.938548 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqrc" event={"ID":"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04","Type":"ContainerStarted","Data":"626430de847853e3bd7d4fde999b0e6c01b360faa6525176b8eea8ce27c8228b"} Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.944565 4991 generic.go:334] "Generic (PLEG): container finished" podID="be81a987-5591-4f8a-ae8c-1fda1597892e" containerID="db31814d68e55858c160075cdb59b1d7124c7f361f0ee36f39142944590a806c" exitCode=0 Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.944617 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfklb" event={"ID":"be81a987-5591-4f8a-ae8c-1fda1597892e","Type":"ContainerDied","Data":"db31814d68e55858c160075cdb59b1d7124c7f361f0ee36f39142944590a806c"} Oct 06 08:23:31 crc kubenswrapper[4991]: I1006 08:23:31.944649 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfklb" event={"ID":"be81a987-5591-4f8a-ae8c-1fda1597892e","Type":"ContainerStarted","Data":"32801ec65145b999eb0047767f809b2385b6a72493e3bc6d886a0c8330722066"} Oct 06 08:23:32 crc kubenswrapper[4991]: I1006 08:23:32.001460 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l9kb2" 
podStartSLOduration=2.456045583 podStartE2EDuration="4.00143557s" podCreationTimestamp="2025-10-06 08:23:28 +0000 UTC" firstStartedPulling="2025-10-06 08:23:29.914520727 +0000 UTC m=+261.652270738" lastFinishedPulling="2025-10-06 08:23:31.459910694 +0000 UTC m=+263.197660725" observedRunningTime="2025-10-06 08:23:32.000630995 +0000 UTC m=+263.738381036" watchObservedRunningTime="2025-10-06 08:23:32.00143557 +0000 UTC m=+263.739185591" Oct 06 08:23:32 crc kubenswrapper[4991]: I1006 08:23:32.952678 4991 generic.go:334] "Generic (PLEG): container finished" podID="7e91b878-dd79-4d4f-8e3c-8ef2cea97e04" containerID="c745c2acbd5a3111c1f96b2c4c46081fb244caf23e94422bf72e21ffb98d7127" exitCode=0 Oct 06 08:23:32 crc kubenswrapper[4991]: I1006 08:23:32.953177 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqrc" event={"ID":"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04","Type":"ContainerDied","Data":"c745c2acbd5a3111c1f96b2c4c46081fb244caf23e94422bf72e21ffb98d7127"} Oct 06 08:23:32 crc kubenswrapper[4991]: I1006 08:23:32.955283 4991 generic.go:334] "Generic (PLEG): container finished" podID="be81a987-5591-4f8a-ae8c-1fda1597892e" containerID="6250e5e36325cecbd308c8ab50a8332821a4628eb25a368f3612696228dc0169" exitCode=0 Oct 06 08:23:32 crc kubenswrapper[4991]: I1006 08:23:32.955372 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfklb" event={"ID":"be81a987-5591-4f8a-ae8c-1fda1597892e","Type":"ContainerDied","Data":"6250e5e36325cecbd308c8ab50a8332821a4628eb25a368f3612696228dc0169"} Oct 06 08:23:32 crc kubenswrapper[4991]: I1006 08:23:32.957724 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p9vs2" event={"ID":"094a7c3a-f150-42f1-bc2b-5e53b2565058","Type":"ContainerStarted","Data":"22637d334f2fed24be1d75965a85cdd95b61f72cbd1e9c9718ef7200525bd0c2"} Oct 06 08:23:33 crc kubenswrapper[4991]: I1006 08:23:33.001007 4991 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p9vs2" podStartSLOduration=2.488150735 podStartE2EDuration="5.000983418s" podCreationTimestamp="2025-10-06 08:23:28 +0000 UTC" firstStartedPulling="2025-10-06 08:23:29.911291818 +0000 UTC m=+261.649041839" lastFinishedPulling="2025-10-06 08:23:32.424124501 +0000 UTC m=+264.161874522" observedRunningTime="2025-10-06 08:23:32.99778105 +0000 UTC m=+264.735531081" watchObservedRunningTime="2025-10-06 08:23:33.000983418 +0000 UTC m=+264.738733449" Oct 06 08:23:33 crc kubenswrapper[4991]: I1006 08:23:33.965086 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqrc" event={"ID":"7e91b878-dd79-4d4f-8e3c-8ef2cea97e04","Type":"ContainerStarted","Data":"1c47939cbeb04f34c0cffeabe49def32ff505d82724a5256950185ce783a1def"} Oct 06 08:23:33 crc kubenswrapper[4991]: I1006 08:23:33.967706 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfklb" event={"ID":"be81a987-5591-4f8a-ae8c-1fda1597892e","Type":"ContainerStarted","Data":"cf173037e6ff8a6f041e73fd87fcd975c4ce79fa9f8b1e6334c58a5d56dbb195"} Oct 06 08:23:33 crc kubenswrapper[4991]: I1006 08:23:33.989458 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jzqrc" podStartSLOduration=2.484131647 podStartE2EDuration="3.989437187s" podCreationTimestamp="2025-10-06 08:23:30 +0000 UTC" firstStartedPulling="2025-10-06 08:23:31.939922758 +0000 UTC m=+263.677672779" lastFinishedPulling="2025-10-06 08:23:33.445228298 +0000 UTC m=+265.182978319" observedRunningTime="2025-10-06 08:23:33.986075233 +0000 UTC m=+265.723825274" watchObservedRunningTime="2025-10-06 08:23:33.989437187 +0000 UTC m=+265.727187208" Oct 06 08:23:34 crc kubenswrapper[4991]: I1006 08:23:34.003605 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-hfklb" podStartSLOduration=2.546305748 podStartE2EDuration="4.003584309s" podCreationTimestamp="2025-10-06 08:23:30 +0000 UTC" firstStartedPulling="2025-10-06 08:23:31.946738076 +0000 UTC m=+263.684488097" lastFinishedPulling="2025-10-06 08:23:33.404016647 +0000 UTC m=+265.141766658" observedRunningTime="2025-10-06 08:23:34.001806425 +0000 UTC m=+265.739556466" watchObservedRunningTime="2025-10-06 08:23:34.003584309 +0000 UTC m=+265.741334330" Oct 06 08:23:38 crc kubenswrapper[4991]: I1006 08:23:38.695996 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:38 crc kubenswrapper[4991]: I1006 08:23:38.697584 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:38 crc kubenswrapper[4991]: I1006 08:23:38.738156 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:38 crc kubenswrapper[4991]: I1006 08:23:38.904328 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:38 crc kubenswrapper[4991]: I1006 08:23:38.904404 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:38 crc kubenswrapper[4991]: I1006 08:23:38.944563 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:39 crc kubenswrapper[4991]: I1006 08:23:39.028434 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l9kb2" Oct 06 08:23:39 crc kubenswrapper[4991]: I1006 08:23:39.034995 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-p9vs2" Oct 06 08:23:41 crc kubenswrapper[4991]: I1006 08:23:41.143047 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:41 crc kubenswrapper[4991]: I1006 08:23:41.143242 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:41 crc kubenswrapper[4991]: I1006 08:23:41.188095 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:23:41 crc kubenswrapper[4991]: I1006 08:23:41.296488 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:41 crc kubenswrapper[4991]: I1006 08:23:41.296535 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:41 crc kubenswrapper[4991]: I1006 08:23:41.339782 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:42 crc kubenswrapper[4991]: I1006 08:23:42.044125 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hfklb" Oct 06 08:23:42 crc kubenswrapper[4991]: I1006 08:23:42.044522 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jzqrc" Oct 06 08:24:57 crc kubenswrapper[4991]: I1006 08:24:57.528885 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:24:57 crc kubenswrapper[4991]: I1006 08:24:57.529780 4991 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:25:27 crc kubenswrapper[4991]: I1006 08:25:27.529413 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:25:27 crc kubenswrapper[4991]: I1006 08:25:27.530634 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:25:57 crc kubenswrapper[4991]: I1006 08:25:57.529387 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:25:57 crc kubenswrapper[4991]: I1006 08:25:57.530228 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:25:57 crc kubenswrapper[4991]: I1006 08:25:57.530376 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:25:57 crc kubenswrapper[4991]: I1006 08:25:57.531392 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9395c4bf8dda68ef7b021048ac5697dbf4d4e81b3af0b2f1dc5c5f35a3034cc5"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:25:57 crc kubenswrapper[4991]: I1006 08:25:57.531507 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" containerID="cri-o://9395c4bf8dda68ef7b021048ac5697dbf4d4e81b3af0b2f1dc5c5f35a3034cc5" gracePeriod=600 Oct 06 08:25:57 crc kubenswrapper[4991]: I1006 08:25:57.913648 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="9395c4bf8dda68ef7b021048ac5697dbf4d4e81b3af0b2f1dc5c5f35a3034cc5" exitCode=0 Oct 06 08:25:57 crc kubenswrapper[4991]: I1006 08:25:57.913736 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"9395c4bf8dda68ef7b021048ac5697dbf4d4e81b3af0b2f1dc5c5f35a3034cc5"} Oct 06 08:25:57 crc kubenswrapper[4991]: I1006 08:25:57.914291 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"fb961ab5b435bf1e63f075c2147ea3969118f23f42842e5e0de966c0250bb8d1"} Oct 06 08:25:57 crc kubenswrapper[4991]: I1006 08:25:57.914357 4991 scope.go:117] "RemoveContainer" 
containerID="b8b7cf7fcec9882dbad248c522abd30ad0a62e4464ca386d04e12507a940664c" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.073863 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m8mxf"] Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.075350 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.102036 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m8mxf"] Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.233104 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4d65082-c5fa-4c6e-a4e0-14329d988df0-bound-sa-token\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.233247 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4d65082-c5fa-4c6e-a4e0-14329d988df0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.233276 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg22s\" (UniqueName: \"kubernetes.io/projected/c4d65082-c5fa-4c6e-a4e0-14329d988df0-kube-api-access-dg22s\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 
08:26:20.233362 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4d65082-c5fa-4c6e-a4e0-14329d988df0-trusted-ca\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.233448 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4d65082-c5fa-4c6e-a4e0-14329d988df0-registry-tls\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.233481 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4d65082-c5fa-4c6e-a4e0-14329d988df0-registry-certificates\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.233560 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.233613 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4d65082-c5fa-4c6e-a4e0-14329d988df0-installation-pull-secrets\") pod 
\"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.273181 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.335694 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4d65082-c5fa-4c6e-a4e0-14329d988df0-registry-tls\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.335742 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4d65082-c5fa-4c6e-a4e0-14329d988df0-registry-certificates\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.335804 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4d65082-c5fa-4c6e-a4e0-14329d988df0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.335828 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4d65082-c5fa-4c6e-a4e0-14329d988df0-bound-sa-token\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.335849 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4d65082-c5fa-4c6e-a4e0-14329d988df0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.335868 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg22s\" (UniqueName: \"kubernetes.io/projected/c4d65082-c5fa-4c6e-a4e0-14329d988df0-kube-api-access-dg22s\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.335883 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4d65082-c5fa-4c6e-a4e0-14329d988df0-trusted-ca\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.338800 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4d65082-c5fa-4c6e-a4e0-14329d988df0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.339871 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4d65082-c5fa-4c6e-a4e0-14329d988df0-trusted-ca\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.339964 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4d65082-c5fa-4c6e-a4e0-14329d988df0-registry-certificates\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.348053 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4d65082-c5fa-4c6e-a4e0-14329d988df0-registry-tls\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.348103 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4d65082-c5fa-4c6e-a4e0-14329d988df0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.351629 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4d65082-c5fa-4c6e-a4e0-14329d988df0-bound-sa-token\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc 
kubenswrapper[4991]: I1006 08:26:20.353840 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg22s\" (UniqueName: \"kubernetes.io/projected/c4d65082-c5fa-4c6e-a4e0-14329d988df0-kube-api-access-dg22s\") pod \"image-registry-66df7c8f76-m8mxf\" (UID: \"c4d65082-c5fa-4c6e-a4e0-14329d988df0\") " pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.392532 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:20 crc kubenswrapper[4991]: I1006 08:26:20.798733 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-m8mxf"] Oct 06 08:26:21 crc kubenswrapper[4991]: I1006 08:26:21.082911 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" event={"ID":"c4d65082-c5fa-4c6e-a4e0-14329d988df0","Type":"ContainerStarted","Data":"dfc7ddf30a9858fb078b0f538bdc9ca305ef5e10fed7db02f90a3d8fc7743835"} Oct 06 08:26:21 crc kubenswrapper[4991]: I1006 08:26:21.083422 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:21 crc kubenswrapper[4991]: I1006 08:26:21.083440 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" event={"ID":"c4d65082-c5fa-4c6e-a4e0-14329d988df0","Type":"ContainerStarted","Data":"9e9a0131ddbb85ba1e2355ec6dfa63a795e2ef4dfe925a42f3ba0db29473f004"} Oct 06 08:26:21 crc kubenswrapper[4991]: I1006 08:26:21.105858 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" podStartSLOduration=1.1058360839999999 podStartE2EDuration="1.105836084s" podCreationTimestamp="2025-10-06 08:26:20 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:26:21.103048407 +0000 UTC m=+432.840798448" watchObservedRunningTime="2025-10-06 08:26:21.105836084 +0000 UTC m=+432.843586105" Oct 06 08:26:40 crc kubenswrapper[4991]: I1006 08:26:40.401398 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-m8mxf" Oct 06 08:26:40 crc kubenswrapper[4991]: I1006 08:26:40.453883 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zl4k8"] Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.500382 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" podUID="fb83cb02-67d8-4f38-aad6-001ea28de60a" containerName="registry" containerID="cri-o://b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245" gracePeriod=30 Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.924169 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.986840 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-certificates\") pod \"fb83cb02-67d8-4f38-aad6-001ea28de60a\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.986903 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-trusted-ca\") pod \"fb83cb02-67d8-4f38-aad6-001ea28de60a\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.986936 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-bound-sa-token\") pod \"fb83cb02-67d8-4f38-aad6-001ea28de60a\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.987047 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb83cb02-67d8-4f38-aad6-001ea28de60a-ca-trust-extracted\") pod \"fb83cb02-67d8-4f38-aad6-001ea28de60a\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.987083 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-tls\") pod \"fb83cb02-67d8-4f38-aad6-001ea28de60a\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.987121 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb83cb02-67d8-4f38-aad6-001ea28de60a-installation-pull-secrets\") pod \"fb83cb02-67d8-4f38-aad6-001ea28de60a\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.987422 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fb83cb02-67d8-4f38-aad6-001ea28de60a\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.987486 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bzcr\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-kube-api-access-6bzcr\") pod \"fb83cb02-67d8-4f38-aad6-001ea28de60a\" (UID: \"fb83cb02-67d8-4f38-aad6-001ea28de60a\") " Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.988043 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fb83cb02-67d8-4f38-aad6-001ea28de60a" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:27:05 crc kubenswrapper[4991]: I1006 08:27:05.991286 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fb83cb02-67d8-4f38-aad6-001ea28de60a" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:05.997719 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb83cb02-67d8-4f38-aad6-001ea28de60a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fb83cb02-67d8-4f38-aad6-001ea28de60a" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:05.997838 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fb83cb02-67d8-4f38-aad6-001ea28de60a" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:05.998495 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-kube-api-access-6bzcr" (OuterVolumeSpecName: "kube-api-access-6bzcr") pod "fb83cb02-67d8-4f38-aad6-001ea28de60a" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a"). InnerVolumeSpecName "kube-api-access-6bzcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:05.998892 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fb83cb02-67d8-4f38-aad6-001ea28de60a" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.010501 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fb83cb02-67d8-4f38-aad6-001ea28de60a" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.016412 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb83cb02-67d8-4f38-aad6-001ea28de60a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fb83cb02-67d8-4f38-aad6-001ea28de60a" (UID: "fb83cb02-67d8-4f38-aad6-001ea28de60a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.088837 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.088893 4991 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.088908 4991 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fb83cb02-67d8-4f38-aad6-001ea28de60a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.088921 4991 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.088936 4991 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fb83cb02-67d8-4f38-aad6-001ea28de60a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.088950 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bzcr\" (UniqueName: \"kubernetes.io/projected/fb83cb02-67d8-4f38-aad6-001ea28de60a-kube-api-access-6bzcr\") on node \"crc\" DevicePath \"\"" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.088961 4991 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fb83cb02-67d8-4f38-aad6-001ea28de60a-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.361073 4991 generic.go:334] "Generic (PLEG): container finished" podID="fb83cb02-67d8-4f38-aad6-001ea28de60a" containerID="b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245" exitCode=0 Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.361128 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" event={"ID":"fb83cb02-67d8-4f38-aad6-001ea28de60a","Type":"ContainerDied","Data":"b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245"} Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.361165 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" event={"ID":"fb83cb02-67d8-4f38-aad6-001ea28de60a","Type":"ContainerDied","Data":"6496040adc61a7bae86d50e9d9ead6a70f461991343f51b2c4bbbf2814ec0e0b"} Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.361173 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zl4k8" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.361187 4991 scope.go:117] "RemoveContainer" containerID="b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.377376 4991 scope.go:117] "RemoveContainer" containerID="b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245" Oct 06 08:27:06 crc kubenswrapper[4991]: E1006 08:27:06.378124 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245\": container with ID starting with b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245 not found: ID does not exist" containerID="b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.378170 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245"} err="failed to get container status \"b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245\": rpc error: code = NotFound desc = could not find container \"b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245\": container with ID starting with b150af1b0d97c40b3d1cacf3b4b3f84898001dce74ace465b19f2bf7e4a50245 not found: ID does not exist" Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.386225 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zl4k8"] Oct 06 08:27:06 crc kubenswrapper[4991]: I1006 08:27:06.389573 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zl4k8"] Oct 06 08:27:07 crc kubenswrapper[4991]: I1006 08:27:07.257989 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fb83cb02-67d8-4f38-aad6-001ea28de60a" path="/var/lib/kubelet/pods/fb83cb02-67d8-4f38-aad6-001ea28de60a/volumes" Oct 06 08:27:57 crc kubenswrapper[4991]: I1006 08:27:57.528779 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:27:57 crc kubenswrapper[4991]: I1006 08:27:57.529457 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:28:27 crc kubenswrapper[4991]: I1006 08:28:27.529156 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:28:27 crc kubenswrapper[4991]: I1006 08:28:27.530009 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:28:57 crc kubenswrapper[4991]: I1006 08:28:57.529091 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:28:57 crc 
kubenswrapper[4991]: I1006 08:28:57.529750 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:28:57 crc kubenswrapper[4991]: I1006 08:28:57.529801 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:28:57 crc kubenswrapper[4991]: I1006 08:28:57.530436 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb961ab5b435bf1e63f075c2147ea3969118f23f42842e5e0de966c0250bb8d1"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:28:57 crc kubenswrapper[4991]: I1006 08:28:57.530499 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" containerID="cri-o://fb961ab5b435bf1e63f075c2147ea3969118f23f42842e5e0de966c0250bb8d1" gracePeriod=600 Oct 06 08:28:58 crc kubenswrapper[4991]: I1006 08:28:58.085851 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="fb961ab5b435bf1e63f075c2147ea3969118f23f42842e5e0de966c0250bb8d1" exitCode=0 Oct 06 08:28:58 crc kubenswrapper[4991]: I1006 08:28:58.085962 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"fb961ab5b435bf1e63f075c2147ea3969118f23f42842e5e0de966c0250bb8d1"} 
Oct 06 08:28:58 crc kubenswrapper[4991]: I1006 08:28:58.086403 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"3169b67ddb39c04fb8f22ea7b9f7ae55cd068df65648f9ad55f3275e8f92dd3b"} Oct 06 08:28:58 crc kubenswrapper[4991]: I1006 08:28:58.086469 4991 scope.go:117] "RemoveContainer" containerID="9395c4bf8dda68ef7b021048ac5697dbf4d4e81b3af0b2f1dc5c5f35a3034cc5" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.135895 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7"] Oct 06 08:30:00 crc kubenswrapper[4991]: E1006 08:30:00.136689 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb83cb02-67d8-4f38-aad6-001ea28de60a" containerName="registry" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.136705 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb83cb02-67d8-4f38-aad6-001ea28de60a" containerName="registry" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.136842 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb83cb02-67d8-4f38-aad6-001ea28de60a" containerName="registry" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.137359 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.139269 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.139324 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.145193 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7"] Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.239250 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5114b17-056d-477b-a8dc-a798a3afcc68-config-volume\") pod \"collect-profiles-29328990-4cnh7\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.239385 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k9nj\" (UniqueName: \"kubernetes.io/projected/b5114b17-056d-477b-a8dc-a798a3afcc68-kube-api-access-2k9nj\") pod \"collect-profiles-29328990-4cnh7\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.239419 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5114b17-056d-477b-a8dc-a798a3afcc68-secret-volume\") pod \"collect-profiles-29328990-4cnh7\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.340725 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k9nj\" (UniqueName: \"kubernetes.io/projected/b5114b17-056d-477b-a8dc-a798a3afcc68-kube-api-access-2k9nj\") pod \"collect-profiles-29328990-4cnh7\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.340789 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5114b17-056d-477b-a8dc-a798a3afcc68-secret-volume\") pod \"collect-profiles-29328990-4cnh7\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.340835 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5114b17-056d-477b-a8dc-a798a3afcc68-config-volume\") pod \"collect-profiles-29328990-4cnh7\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.342355 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5114b17-056d-477b-a8dc-a798a3afcc68-config-volume\") pod \"collect-profiles-29328990-4cnh7\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.351734 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b5114b17-056d-477b-a8dc-a798a3afcc68-secret-volume\") pod \"collect-profiles-29328990-4cnh7\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.357408 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k9nj\" (UniqueName: \"kubernetes.io/projected/b5114b17-056d-477b-a8dc-a798a3afcc68-kube-api-access-2k9nj\") pod \"collect-profiles-29328990-4cnh7\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.454657 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:00 crc kubenswrapper[4991]: I1006 08:30:00.633128 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7"] Oct 06 08:30:01 crc kubenswrapper[4991]: I1006 08:30:01.478388 4991 generic.go:334] "Generic (PLEG): container finished" podID="b5114b17-056d-477b-a8dc-a798a3afcc68" containerID="917828eb243f9e19a7a9efc0cc5298dd05329374558b46f8b77eb29e8b020124" exitCode=0 Oct 06 08:30:01 crc kubenswrapper[4991]: I1006 08:30:01.478514 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" event={"ID":"b5114b17-056d-477b-a8dc-a798a3afcc68","Type":"ContainerDied","Data":"917828eb243f9e19a7a9efc0cc5298dd05329374558b46f8b77eb29e8b020124"} Oct 06 08:30:01 crc kubenswrapper[4991]: I1006 08:30:01.478658 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" 
event={"ID":"b5114b17-056d-477b-a8dc-a798a3afcc68","Type":"ContainerStarted","Data":"6eadfb6b5f39b97e1a5e000f4ea4a9ee9b661e668a3e8a3d8de668be52cdb33a"} Oct 06 08:30:02 crc kubenswrapper[4991]: I1006 08:30:02.717994 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:02 crc kubenswrapper[4991]: I1006 08:30:02.876322 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k9nj\" (UniqueName: \"kubernetes.io/projected/b5114b17-056d-477b-a8dc-a798a3afcc68-kube-api-access-2k9nj\") pod \"b5114b17-056d-477b-a8dc-a798a3afcc68\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " Oct 06 08:30:02 crc kubenswrapper[4991]: I1006 08:30:02.876373 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5114b17-056d-477b-a8dc-a798a3afcc68-secret-volume\") pod \"b5114b17-056d-477b-a8dc-a798a3afcc68\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " Oct 06 08:30:02 crc kubenswrapper[4991]: I1006 08:30:02.876418 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5114b17-056d-477b-a8dc-a798a3afcc68-config-volume\") pod \"b5114b17-056d-477b-a8dc-a798a3afcc68\" (UID: \"b5114b17-056d-477b-a8dc-a798a3afcc68\") " Oct 06 08:30:02 crc kubenswrapper[4991]: I1006 08:30:02.877077 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5114b17-056d-477b-a8dc-a798a3afcc68-config-volume" (OuterVolumeSpecName: "config-volume") pod "b5114b17-056d-477b-a8dc-a798a3afcc68" (UID: "b5114b17-056d-477b-a8dc-a798a3afcc68"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:30:02 crc kubenswrapper[4991]: I1006 08:30:02.881791 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5114b17-056d-477b-a8dc-a798a3afcc68-kube-api-access-2k9nj" (OuterVolumeSpecName: "kube-api-access-2k9nj") pod "b5114b17-056d-477b-a8dc-a798a3afcc68" (UID: "b5114b17-056d-477b-a8dc-a798a3afcc68"). InnerVolumeSpecName "kube-api-access-2k9nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:30:02 crc kubenswrapper[4991]: I1006 08:30:02.881948 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5114b17-056d-477b-a8dc-a798a3afcc68-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b5114b17-056d-477b-a8dc-a798a3afcc68" (UID: "b5114b17-056d-477b-a8dc-a798a3afcc68"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:30:02 crc kubenswrapper[4991]: I1006 08:30:02.978256 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k9nj\" (UniqueName: \"kubernetes.io/projected/b5114b17-056d-477b-a8dc-a798a3afcc68-kube-api-access-2k9nj\") on node \"crc\" DevicePath \"\"" Oct 06 08:30:02 crc kubenswrapper[4991]: I1006 08:30:02.978308 4991 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5114b17-056d-477b-a8dc-a798a3afcc68-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:30:02 crc kubenswrapper[4991]: I1006 08:30:02.978318 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5114b17-056d-477b-a8dc-a798a3afcc68-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:30:03 crc kubenswrapper[4991]: I1006 08:30:03.492543 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" 
event={"ID":"b5114b17-056d-477b-a8dc-a798a3afcc68","Type":"ContainerDied","Data":"6eadfb6b5f39b97e1a5e000f4ea4a9ee9b661e668a3e8a3d8de668be52cdb33a"} Oct 06 08:30:03 crc kubenswrapper[4991]: I1006 08:30:03.492633 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eadfb6b5f39b97e1a5e000f4ea4a9ee9b661e668a3e8a3d8de668be52cdb33a" Oct 06 08:30:03 crc kubenswrapper[4991]: I1006 08:30:03.492637 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29328990-4cnh7" Oct 06 08:30:57 crc kubenswrapper[4991]: I1006 08:30:57.529645 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:30:57 crc kubenswrapper[4991]: I1006 08:30:57.530485 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:31:27 crc kubenswrapper[4991]: I1006 08:31:27.529629 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:31:27 crc kubenswrapper[4991]: I1006 08:31:27.530275 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.097043 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz6jp"] Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.097898 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" podUID="4a605716-cfa0-49ed-826e-bb9b2cd4d834" containerName="controller-manager" containerID="cri-o://c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d" gracePeriod=30 Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.178790 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp"] Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.179154 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" podUID="15148cd6-6d64-4a92-a334-b5014bf8b05a" containerName="route-controller-manager" containerID="cri-o://054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6" gracePeriod=30 Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.496613 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.556557 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.650687 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl7r2\" (UniqueName: \"kubernetes.io/projected/15148cd6-6d64-4a92-a334-b5014bf8b05a-kube-api-access-gl7r2\") pod \"15148cd6-6d64-4a92-a334-b5014bf8b05a\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.650727 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-config\") pod \"15148cd6-6d64-4a92-a334-b5014bf8b05a\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.650797 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-client-ca\") pod \"15148cd6-6d64-4a92-a334-b5014bf8b05a\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.650814 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-config\") pod \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.650845 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-proxy-ca-bundles\") pod \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.650862 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-76drl\" (UniqueName: \"kubernetes.io/projected/4a605716-cfa0-49ed-826e-bb9b2cd4d834-kube-api-access-76drl\") pod \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.650911 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-client-ca\") pod \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.650933 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15148cd6-6d64-4a92-a334-b5014bf8b05a-serving-cert\") pod \"15148cd6-6d64-4a92-a334-b5014bf8b05a\" (UID: \"15148cd6-6d64-4a92-a334-b5014bf8b05a\") " Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.650962 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a605716-cfa0-49ed-826e-bb9b2cd4d834-serving-cert\") pod \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\" (UID: \"4a605716-cfa0-49ed-826e-bb9b2cd4d834\") " Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.651355 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-config" (OuterVolumeSpecName: "config") pod "15148cd6-6d64-4a92-a334-b5014bf8b05a" (UID: "15148cd6-6d64-4a92-a334-b5014bf8b05a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.651899 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-client-ca" (OuterVolumeSpecName: "client-ca") pod "15148cd6-6d64-4a92-a334-b5014bf8b05a" (UID: "15148cd6-6d64-4a92-a334-b5014bf8b05a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.652013 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4a605716-cfa0-49ed-826e-bb9b2cd4d834" (UID: "4a605716-cfa0-49ed-826e-bb9b2cd4d834"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.652137 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-client-ca" (OuterVolumeSpecName: "client-ca") pod "4a605716-cfa0-49ed-826e-bb9b2cd4d834" (UID: "4a605716-cfa0-49ed-826e-bb9b2cd4d834"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.652257 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-config" (OuterVolumeSpecName: "config") pod "4a605716-cfa0-49ed-826e-bb9b2cd4d834" (UID: "4a605716-cfa0-49ed-826e-bb9b2cd4d834"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.656239 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a605716-cfa0-49ed-826e-bb9b2cd4d834-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4a605716-cfa0-49ed-826e-bb9b2cd4d834" (UID: "4a605716-cfa0-49ed-826e-bb9b2cd4d834"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.656559 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a605716-cfa0-49ed-826e-bb9b2cd4d834-kube-api-access-76drl" (OuterVolumeSpecName: "kube-api-access-76drl") pod "4a605716-cfa0-49ed-826e-bb9b2cd4d834" (UID: "4a605716-cfa0-49ed-826e-bb9b2cd4d834"). InnerVolumeSpecName "kube-api-access-76drl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.656693 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15148cd6-6d64-4a92-a334-b5014bf8b05a-kube-api-access-gl7r2" (OuterVolumeSpecName: "kube-api-access-gl7r2") pod "15148cd6-6d64-4a92-a334-b5014bf8b05a" (UID: "15148cd6-6d64-4a92-a334-b5014bf8b05a"). InnerVolumeSpecName "kube-api-access-gl7r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.659598 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15148cd6-6d64-4a92-a334-b5014bf8b05a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "15148cd6-6d64-4a92-a334-b5014bf8b05a" (UID: "15148cd6-6d64-4a92-a334-b5014bf8b05a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.752687 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.752714 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.752743 4991 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.752757 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76drl\" (UniqueName: \"kubernetes.io/projected/4a605716-cfa0-49ed-826e-bb9b2cd4d834-kube-api-access-76drl\") on node \"crc\" DevicePath \"\"" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.752767 4991 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a605716-cfa0-49ed-826e-bb9b2cd4d834-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.752774 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15148cd6-6d64-4a92-a334-b5014bf8b05a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.752782 4991 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a605716-cfa0-49ed-826e-bb9b2cd4d834-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.752789 4991 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-gl7r2\" (UniqueName: \"kubernetes.io/projected/15148cd6-6d64-4a92-a334-b5014bf8b05a-kube-api-access-gl7r2\") on node \"crc\" DevicePath \"\"" Oct 06 08:31:48 crc kubenswrapper[4991]: I1006 08:31:48.752797 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15148cd6-6d64-4a92-a334-b5014bf8b05a-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.130817 4991 generic.go:334] "Generic (PLEG): container finished" podID="4a605716-cfa0-49ed-826e-bb9b2cd4d834" containerID="c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d" exitCode=0 Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.130912 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" event={"ID":"4a605716-cfa0-49ed-826e-bb9b2cd4d834","Type":"ContainerDied","Data":"c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d"} Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.130945 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" event={"ID":"4a605716-cfa0-49ed-826e-bb9b2cd4d834","Type":"ContainerDied","Data":"5798e322bd783c09421f45c2f0867315f2abf1ce387a8b2466521d160ffcfd58"} Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.130970 4991 scope.go:117] "RemoveContainer" containerID="c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d" Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.131092 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz6jp" Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.137423 4991 generic.go:334] "Generic (PLEG): container finished" podID="15148cd6-6d64-4a92-a334-b5014bf8b05a" containerID="054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6" exitCode=0 Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.137477 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" event={"ID":"15148cd6-6d64-4a92-a334-b5014bf8b05a","Type":"ContainerDied","Data":"054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6"} Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.137494 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.137535 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp" event={"ID":"15148cd6-6d64-4a92-a334-b5014bf8b05a","Type":"ContainerDied","Data":"1389990a6108879aa898d1496125e481db9802b3cda22e58d802ec547dd3b43c"} Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.167721 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz6jp"] Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.171735 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz6jp"] Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.172284 4991 scope.go:117] "RemoveContainer" containerID="c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d" Oct 06 08:31:49 crc kubenswrapper[4991]: E1006 08:31:49.174352 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d\": container with ID starting with c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d not found: ID does not exist" containerID="c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d" Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.174398 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d"} err="failed to get container status \"c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d\": rpc error: code = NotFound desc = could not find container \"c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d\": container with ID starting with c61b82e644bb15a70a94960b2fb97995a17ec187339cbd358c92c9f5f584bf2d not found: ID does not exist" Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.174426 4991 scope.go:117] "RemoveContainer" containerID="054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6" Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.192158 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp"] Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.194897 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9wqrp"] Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.195242 4991 scope.go:117] "RemoveContainer" containerID="054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6" Oct 06 08:31:49 crc kubenswrapper[4991]: E1006 08:31:49.195849 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6\": container with ID starting with 
054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6 not found: ID does not exist" containerID="054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6" Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.195907 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6"} err="failed to get container status \"054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6\": rpc error: code = NotFound desc = could not find container \"054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6\": container with ID starting with 054403adcfae8f7f553f40574870c8723186b4f36e3f4143c5d93e029501bcd6 not found: ID does not exist" Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.254838 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15148cd6-6d64-4a92-a334-b5014bf8b05a" path="/var/lib/kubelet/pods/15148cd6-6d64-4a92-a334-b5014bf8b05a/volumes" Oct 06 08:31:49 crc kubenswrapper[4991]: I1006 08:31:49.256760 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a605716-cfa0-49ed-826e-bb9b2cd4d834" path="/var/lib/kubelet/pods/4a605716-cfa0-49ed-826e-bb9b2cd4d834/volumes" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.208780 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-849695955-l9ft4"] Oct 06 08:31:50 crc kubenswrapper[4991]: E1006 08:31:50.209141 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5114b17-056d-477b-a8dc-a798a3afcc68" containerName="collect-profiles" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.209165 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5114b17-056d-477b-a8dc-a798a3afcc68" containerName="collect-profiles" Oct 06 08:31:50 crc kubenswrapper[4991]: E1006 08:31:50.209184 4991 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="15148cd6-6d64-4a92-a334-b5014bf8b05a" containerName="route-controller-manager" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.209198 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="15148cd6-6d64-4a92-a334-b5014bf8b05a" containerName="route-controller-manager" Oct 06 08:31:50 crc kubenswrapper[4991]: E1006 08:31:50.209229 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a605716-cfa0-49ed-826e-bb9b2cd4d834" containerName="controller-manager" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.209243 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a605716-cfa0-49ed-826e-bb9b2cd4d834" containerName="controller-manager" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.209453 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a605716-cfa0-49ed-826e-bb9b2cd4d834" containerName="controller-manager" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.209473 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5114b17-056d-477b-a8dc-a798a3afcc68" containerName="collect-profiles" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.209499 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="15148cd6-6d64-4a92-a334-b5014bf8b05a" containerName="route-controller-manager" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.210078 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.211477 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8"] Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.212042 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.213707 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.213765 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.214200 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.214205 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.214243 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.214667 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.214963 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.216116 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.216145 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.216286 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 06 
08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.216769 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.218263 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.227324 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.228237 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8"] Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.260894 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-849695955-l9ft4"] Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.373122 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b95eb93-99fd-4006-ada6-a060aa186ef2-proxy-ca-bundles\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.373443 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b95eb93-99fd-4006-ada6-a060aa186ef2-config\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.373544 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfr72\" 
(UniqueName: \"kubernetes.io/projected/a3052a47-67dc-4e20-9a83-46e5b90412a8-kube-api-access-nfr72\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.373638 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b95eb93-99fd-4006-ada6-a060aa186ef2-client-ca\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.373746 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3052a47-67dc-4e20-9a83-46e5b90412a8-serving-cert\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.373853 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3052a47-67dc-4e20-9a83-46e5b90412a8-config\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.373961 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b95eb93-99fd-4006-ada6-a060aa186ef2-serving-cert\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " 
pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.374058 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r8jh\" (UniqueName: \"kubernetes.io/projected/5b95eb93-99fd-4006-ada6-a060aa186ef2-kube-api-access-6r8jh\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.374203 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3052a47-67dc-4e20-9a83-46e5b90412a8-client-ca\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.475310 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3052a47-67dc-4e20-9a83-46e5b90412a8-client-ca\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.475369 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b95eb93-99fd-4006-ada6-a060aa186ef2-proxy-ca-bundles\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.475395 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5b95eb93-99fd-4006-ada6-a060aa186ef2-config\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.475412 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfr72\" (UniqueName: \"kubernetes.io/projected/a3052a47-67dc-4e20-9a83-46e5b90412a8-kube-api-access-nfr72\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.475427 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b95eb93-99fd-4006-ada6-a060aa186ef2-client-ca\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.475447 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3052a47-67dc-4e20-9a83-46e5b90412a8-serving-cert\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.475464 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3052a47-67dc-4e20-9a83-46e5b90412a8-config\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: 
I1006 08:31:50.475483 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b95eb93-99fd-4006-ada6-a060aa186ef2-serving-cert\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.475508 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r8jh\" (UniqueName: \"kubernetes.io/projected/5b95eb93-99fd-4006-ada6-a060aa186ef2-kube-api-access-6r8jh\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.477352 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b95eb93-99fd-4006-ada6-a060aa186ef2-client-ca\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.477744 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b95eb93-99fd-4006-ada6-a060aa186ef2-proxy-ca-bundles\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.477951 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3052a47-67dc-4e20-9a83-46e5b90412a8-client-ca\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.478801 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3052a47-67dc-4e20-9a83-46e5b90412a8-config\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.479120 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b95eb93-99fd-4006-ada6-a060aa186ef2-config\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.493319 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b95eb93-99fd-4006-ada6-a060aa186ef2-serving-cert\") pod \"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.493439 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3052a47-67dc-4e20-9a83-46e5b90412a8-serving-cert\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.497337 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r8jh\" (UniqueName: \"kubernetes.io/projected/5b95eb93-99fd-4006-ada6-a060aa186ef2-kube-api-access-6r8jh\") pod 
\"controller-manager-849695955-l9ft4\" (UID: \"5b95eb93-99fd-4006-ada6-a060aa186ef2\") " pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.507796 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfr72\" (UniqueName: \"kubernetes.io/projected/a3052a47-67dc-4e20-9a83-46e5b90412a8-kube-api-access-nfr72\") pod \"route-controller-manager-5c5649dd4f-sphh8\" (UID: \"a3052a47-67dc-4e20-9a83-46e5b90412a8\") " pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.532111 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.541652 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.754779 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-849695955-l9ft4"] Oct 06 08:31:50 crc kubenswrapper[4991]: I1006 08:31:50.793808 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8"] Oct 06 08:31:51 crc kubenswrapper[4991]: I1006 08:31:51.181733 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" event={"ID":"a3052a47-67dc-4e20-9a83-46e5b90412a8","Type":"ContainerStarted","Data":"7fdfeacab0cfb03a8b8f3280a43fdfd4e9a757d79e5a436d40a848d531b41bb0"} Oct 06 08:31:51 crc kubenswrapper[4991]: I1006 08:31:51.182075 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" 
event={"ID":"a3052a47-67dc-4e20-9a83-46e5b90412a8","Type":"ContainerStarted","Data":"5966147a3971b6aa119b9a3dbfbf7c298274d2e9dbb94db8c0b232256c173ce2"} Oct 06 08:31:51 crc kubenswrapper[4991]: I1006 08:31:51.182452 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:51 crc kubenswrapper[4991]: I1006 08:31:51.187851 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-849695955-l9ft4" event={"ID":"5b95eb93-99fd-4006-ada6-a060aa186ef2","Type":"ContainerStarted","Data":"ce3ba8f5f3d415912158eae3cccf839daa17013de80fc1265bd6663eccbba28b"} Oct 06 08:31:51 crc kubenswrapper[4991]: I1006 08:31:51.187904 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-849695955-l9ft4" event={"ID":"5b95eb93-99fd-4006-ada6-a060aa186ef2","Type":"ContainerStarted","Data":"d909b8ddd66fb76239524309c2bc57fa2742c175216045dbcf74a32f3be6d813"} Oct 06 08:31:51 crc kubenswrapper[4991]: I1006 08:31:51.188632 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:51 crc kubenswrapper[4991]: I1006 08:31:51.202980 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" podStartSLOduration=3.2029608 podStartE2EDuration="3.2029608s" podCreationTimestamp="2025-10-06 08:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:31:51.201461488 +0000 UTC m=+762.939211529" watchObservedRunningTime="2025-10-06 08:31:51.2029608 +0000 UTC m=+762.940710831" Oct 06 08:31:51 crc kubenswrapper[4991]: I1006 08:31:51.217486 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-849695955-l9ft4" Oct 06 08:31:51 crc kubenswrapper[4991]: I1006 08:31:51.226483 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-849695955-l9ft4" podStartSLOduration=3.226460832 podStartE2EDuration="3.226460832s" podCreationTimestamp="2025-10-06 08:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:31:51.221132737 +0000 UTC m=+762.958882778" watchObservedRunningTime="2025-10-06 08:31:51.226460832 +0000 UTC m=+762.964210863" Oct 06 08:31:51 crc kubenswrapper[4991]: I1006 08:31:51.524067 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c5649dd4f-sphh8" Oct 06 08:31:54 crc kubenswrapper[4991]: I1006 08:31:54.565058 4991 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 08:31:57 crc kubenswrapper[4991]: I1006 08:31:57.529514 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:31:57 crc kubenswrapper[4991]: I1006 08:31:57.529957 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:31:57 crc kubenswrapper[4991]: I1006 08:31:57.530037 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:31:57 crc kubenswrapper[4991]: I1006 08:31:57.530786 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3169b67ddb39c04fb8f22ea7b9f7ae55cd068df65648f9ad55f3275e8f92dd3b"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:31:57 crc kubenswrapper[4991]: I1006 08:31:57.530856 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" containerID="cri-o://3169b67ddb39c04fb8f22ea7b9f7ae55cd068df65648f9ad55f3275e8f92dd3b" gracePeriod=600 Oct 06 08:31:58 crc kubenswrapper[4991]: I1006 08:31:58.230440 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="3169b67ddb39c04fb8f22ea7b9f7ae55cd068df65648f9ad55f3275e8f92dd3b" exitCode=0 Oct 06 08:31:58 crc kubenswrapper[4991]: I1006 08:31:58.230552 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"3169b67ddb39c04fb8f22ea7b9f7ae55cd068df65648f9ad55f3275e8f92dd3b"} Oct 06 08:31:58 crc kubenswrapper[4991]: I1006 08:31:58.230706 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"ee6239739727eb6d7bb018a70f54ea31ce396adbac7977b5d2326c033722faa0"} Oct 06 08:31:58 crc kubenswrapper[4991]: I1006 08:31:58.230731 4991 scope.go:117] "RemoveContainer" 
containerID="fb961ab5b435bf1e63f075c2147ea3969118f23f42842e5e0de966c0250bb8d1" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.117935 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-97r99"] Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.120027 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.135122 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-97r99"] Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.201521 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-catalog-content\") pod \"certified-operators-97r99\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.202006 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-utilities\") pod \"certified-operators-97r99\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.202085 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ltqc\" (UniqueName: \"kubernetes.io/projected/eeebf0b9-1177-4117-931d-a67db6bfe581-kube-api-access-4ltqc\") pod \"certified-operators-97r99\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.304277 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-utilities\") pod \"certified-operators-97r99\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.304395 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ltqc\" (UniqueName: \"kubernetes.io/projected/eeebf0b9-1177-4117-931d-a67db6bfe581-kube-api-access-4ltqc\") pod \"certified-operators-97r99\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.304466 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-catalog-content\") pod \"certified-operators-97r99\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.305148 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-catalog-content\") pod \"certified-operators-97r99\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.305528 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-utilities\") pod \"certified-operators-97r99\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.348364 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ltqc\" (UniqueName: 
\"kubernetes.io/projected/eeebf0b9-1177-4117-931d-a67db6bfe581-kube-api-access-4ltqc\") pod \"certified-operators-97r99\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.444833 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:01 crc kubenswrapper[4991]: I1006 08:32:01.842342 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-97r99"] Oct 06 08:32:01 crc kubenswrapper[4991]: W1006 08:32:01.850935 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeebf0b9_1177_4117_931d_a67db6bfe581.slice/crio-18e04623302a38f66a3454814e408077449aac7d288d0897d562e8c9692a252c WatchSource:0}: Error finding container 18e04623302a38f66a3454814e408077449aac7d288d0897d562e8c9692a252c: Status 404 returned error can't find the container with id 18e04623302a38f66a3454814e408077449aac7d288d0897d562e8c9692a252c Oct 06 08:32:02 crc kubenswrapper[4991]: I1006 08:32:02.257599 4991 generic.go:334] "Generic (PLEG): container finished" podID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerID="eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401" exitCode=0 Oct 06 08:32:02 crc kubenswrapper[4991]: I1006 08:32:02.257701 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97r99" event={"ID":"eeebf0b9-1177-4117-931d-a67db6bfe581","Type":"ContainerDied","Data":"eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401"} Oct 06 08:32:02 crc kubenswrapper[4991]: I1006 08:32:02.257929 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97r99" 
event={"ID":"eeebf0b9-1177-4117-931d-a67db6bfe581","Type":"ContainerStarted","Data":"18e04623302a38f66a3454814e408077449aac7d288d0897d562e8c9692a252c"} Oct 06 08:32:02 crc kubenswrapper[4991]: I1006 08:32:02.260002 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:32:03 crc kubenswrapper[4991]: I1006 08:32:03.263820 4991 generic.go:334] "Generic (PLEG): container finished" podID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerID="c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9" exitCode=0 Oct 06 08:32:03 crc kubenswrapper[4991]: I1006 08:32:03.263903 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97r99" event={"ID":"eeebf0b9-1177-4117-931d-a67db6bfe581","Type":"ContainerDied","Data":"c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9"} Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.088993 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wlm55"] Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.090395 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.110371 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlm55"] Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.239667 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-catalog-content\") pod \"redhat-marketplace-wlm55\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.239718 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6vd\" (UniqueName: \"kubernetes.io/projected/9333541d-b671-4a7b-b3a7-8d9646850c0d-kube-api-access-4d6vd\") pod \"redhat-marketplace-wlm55\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.239745 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-utilities\") pod \"redhat-marketplace-wlm55\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.270961 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97r99" event={"ID":"eeebf0b9-1177-4117-931d-a67db6bfe581","Type":"ContainerStarted","Data":"a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775"} Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.292646 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-97r99" podStartSLOduration=1.7412926199999998 podStartE2EDuration="3.29262674s" podCreationTimestamp="2025-10-06 08:32:01 +0000 UTC" firstStartedPulling="2025-10-06 08:32:02.259716326 +0000 UTC m=+773.997466347" lastFinishedPulling="2025-10-06 08:32:03.811050446 +0000 UTC m=+775.548800467" observedRunningTime="2025-10-06 08:32:04.289152325 +0000 UTC m=+776.026902346" watchObservedRunningTime="2025-10-06 08:32:04.29262674 +0000 UTC m=+776.030376761" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.341373 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-catalog-content\") pod \"redhat-marketplace-wlm55\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.341436 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d6vd\" (UniqueName: \"kubernetes.io/projected/9333541d-b671-4a7b-b3a7-8d9646850c0d-kube-api-access-4d6vd\") pod \"redhat-marketplace-wlm55\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.341477 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-utilities\") pod \"redhat-marketplace-wlm55\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.341923 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-utilities\") pod \"redhat-marketplace-wlm55\" (UID: 
\"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.342134 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-catalog-content\") pod \"redhat-marketplace-wlm55\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.360678 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d6vd\" (UniqueName: \"kubernetes.io/projected/9333541d-b671-4a7b-b3a7-8d9646850c0d-kube-api-access-4d6vd\") pod \"redhat-marketplace-wlm55\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.408801 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:04 crc kubenswrapper[4991]: I1006 08:32:04.829360 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlm55"] Oct 06 08:32:05 crc kubenswrapper[4991]: I1006 08:32:05.279221 4991 generic.go:334] "Generic (PLEG): container finished" podID="9333541d-b671-4a7b-b3a7-8d9646850c0d" containerID="0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e" exitCode=0 Oct 06 08:32:05 crc kubenswrapper[4991]: I1006 08:32:05.279343 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlm55" event={"ID":"9333541d-b671-4a7b-b3a7-8d9646850c0d","Type":"ContainerDied","Data":"0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e"} Oct 06 08:32:05 crc kubenswrapper[4991]: I1006 08:32:05.279695 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlm55" 
event={"ID":"9333541d-b671-4a7b-b3a7-8d9646850c0d","Type":"ContainerStarted","Data":"43a8465af9b1627724ca473ceb17359b0ccf9dbc512997a8d70b67944be208eb"} Oct 06 08:32:06 crc kubenswrapper[4991]: I1006 08:32:06.289256 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlm55" event={"ID":"9333541d-b671-4a7b-b3a7-8d9646850c0d","Type":"ContainerStarted","Data":"6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19"} Oct 06 08:32:07 crc kubenswrapper[4991]: I1006 08:32:07.298114 4991 generic.go:334] "Generic (PLEG): container finished" podID="9333541d-b671-4a7b-b3a7-8d9646850c0d" containerID="6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19" exitCode=0 Oct 06 08:32:07 crc kubenswrapper[4991]: I1006 08:32:07.298282 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlm55" event={"ID":"9333541d-b671-4a7b-b3a7-8d9646850c0d","Type":"ContainerDied","Data":"6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19"} Oct 06 08:32:08 crc kubenswrapper[4991]: I1006 08:32:08.307064 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlm55" event={"ID":"9333541d-b671-4a7b-b3a7-8d9646850c0d","Type":"ContainerStarted","Data":"ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68"} Oct 06 08:32:08 crc kubenswrapper[4991]: I1006 08:32:08.332960 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wlm55" podStartSLOduration=1.926162108 podStartE2EDuration="4.332932341s" podCreationTimestamp="2025-10-06 08:32:04 +0000 UTC" firstStartedPulling="2025-10-06 08:32:05.28141005 +0000 UTC m=+777.019160071" lastFinishedPulling="2025-10-06 08:32:07.688180283 +0000 UTC m=+779.425930304" observedRunningTime="2025-10-06 08:32:08.326233107 +0000 UTC m=+780.063983188" watchObservedRunningTime="2025-10-06 08:32:08.332932341 +0000 UTC 
m=+780.070682402" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.236537 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qwljw"] Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.237136 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovn-controller" containerID="cri-o://fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.237199 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="nbdb" containerID="cri-o://023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.237240 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="northd" containerID="cri-o://f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.237257 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="sbdb" containerID="cri-o://62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.237269 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6" 
gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.237289 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovn-acl-logging" containerID="cri-o://b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.237338 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="kube-rbac-proxy-node" containerID="cri-o://8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.278649 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" containerID="cri-o://def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb" gracePeriod=30 Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.329689 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjvmw_58386a1a-6047-42ce-a952-43f397822919/kube-multus/2.log" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.330509 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjvmw_58386a1a-6047-42ce-a952-43f397822919/kube-multus/1.log" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.330559 4991 generic.go:334] "Generic (PLEG): container finished" podID="58386a1a-6047-42ce-a952-43f397822919" containerID="9b6902fadf422e50276f1e9aed20f9eb81e712f105467693407490a695638a3f" exitCode=2 Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.330587 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjvmw" 
event={"ID":"58386a1a-6047-42ce-a952-43f397822919","Type":"ContainerDied","Data":"9b6902fadf422e50276f1e9aed20f9eb81e712f105467693407490a695638a3f"} Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.330616 4991 scope.go:117] "RemoveContainer" containerID="e035d37b9b1d03636577807941d3ba2a897d5e7f540283e75e5311b9d83a3771" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.331111 4991 scope.go:117] "RemoveContainer" containerID="9b6902fadf422e50276f1e9aed20f9eb81e712f105467693407490a695638a3f" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.445005 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.445837 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.490477 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.565766 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/3.log" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.567847 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovn-acl-logging/0.log" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.568408 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovn-controller/0.log" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.568781 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.624802 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-prbg4"] Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625050 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625075 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625089 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625098 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625107 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="kube-rbac-proxy-node" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625114 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="kube-rbac-proxy-node" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625129 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625137 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625152 4991 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="kubecfg-setup" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625160 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="kubecfg-setup" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625168 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="sbdb" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625175 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="sbdb" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625183 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovn-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625191 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovn-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625205 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="nbdb" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625212 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="nbdb" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625222 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625231 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625243 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" 
containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625251 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625260 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="northd" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625268 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="northd" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625280 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovn-acl-logging" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625288 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovn-acl-logging" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625432 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovn-acl-logging" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625447 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625458 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625467 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="northd" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625477 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" 
containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625485 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="kube-rbac-proxy-node" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625499 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="kube-rbac-proxy-ovn-metrics" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625507 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="nbdb" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625517 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovn-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625527 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="sbdb" Oct 06 08:32:11 crc kubenswrapper[4991]: E1006 08:32:11.625629 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625640 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625745 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.625754 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerName="ovnkube-controller" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.635446 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666242 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-netns\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666331 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666535 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-ovn-node-metrics-cert\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666582 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-etc-openvswitch\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666607 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-kubelet\") pod \"ovnkube-node-prbg4\" (UID: 
\"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666640 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-cni-bin\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666664 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-ovnkube-script-lib\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666691 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrv6t\" (UniqueName: \"kubernetes.io/projected/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-kube-api-access-vrv6t\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666716 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666746 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-run-netns\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666774 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-run-openvswitch\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666800 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-run-ovn\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666822 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-var-lib-openvswitch\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666843 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-ovnkube-config\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666874 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-log-socket\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666896 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-env-overrides\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666925 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-cni-netd\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666952 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-slash\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.666981 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.667007 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-run-systemd\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.667033 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-node-log\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.667056 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-systemd-units\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.667095 4991 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.767948 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-ovn-kubernetes\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.767992 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-log-socket\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: 
\"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768016 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-openvswitch\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768036 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-systemd-units\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768091 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-script-lib\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768120 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768162 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-node-log\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768189 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-env-overrides\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768216 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-config\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768236 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-var-lib-openvswitch\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768269 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-bin\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768327 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-slash\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768379 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-systemd\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc 
kubenswrapper[4991]: I1006 08:32:11.768401 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-kubelet\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768436 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmj9m\" (UniqueName: \"kubernetes.io/projected/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-kube-api-access-cmj9m\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768461 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-ovn\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768493 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovn-node-metrics-cert\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768512 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-netd\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768532 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-etc-openvswitch\") pod \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\" (UID: \"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e\") " Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768683 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-cni-bin\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768715 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-ovnkube-script-lib\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768739 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrv6t\" (UniqueName: \"kubernetes.io/projected/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-kube-api-access-vrv6t\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768763 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768790 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-run-netns\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768818 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-run-openvswitch\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768843 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-run-ovn\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768866 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-var-lib-openvswitch\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768888 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-ovnkube-config\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768916 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-log-socket\") pod 
\"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768938 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-env-overrides\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768966 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-cni-netd\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.768993 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-slash\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769019 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769045 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-run-systemd\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769065 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-node-log\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769085 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-systemd-units\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769117 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-ovn-node-metrics-cert\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769140 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-etc-openvswitch\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769164 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-kubelet\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc 
kubenswrapper[4991]: I1006 08:32:11.769244 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-kubelet\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769320 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769355 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-log-socket" (OuterVolumeSpecName: "log-socket") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769378 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769399 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769893 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769945 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-cni-netd\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769928 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769968 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769989 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-slash\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769984 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-run-netns\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770007 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770018 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-run-openvswitch\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.769966 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-run-ovn\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770062 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-run-ovn-kubernetes\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770081 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-run-systemd\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770101 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-node-log\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770116 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-systemd-units\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770133 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-host-cni-bin\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770277 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-etc-openvswitch\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770497 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-var-lib-openvswitch\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770520 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770544 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770560 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-node-log" (OuterVolumeSpecName: "node-log") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770645 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770704 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770757 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-slash" (OuterVolumeSpecName: "host-slash") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770804 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770845 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770880 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770904 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-ovnkube-script-lib\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.770915 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-log-socket\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.771028 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-ovnkube-config\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.772151 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-env-overrides\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.775912 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.776002 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-ovn-node-metrics-cert\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.785383 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.790989 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-kube-api-access-cmj9m" (OuterVolumeSpecName: "kube-api-access-cmj9m") pod "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" (UID: "977b0faa-5b3d-4e9d-bef4-ba47f8764c6e"). InnerVolumeSpecName "kube-api-access-cmj9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.793598 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrv6t\" (UniqueName: \"kubernetes.io/projected/e30e40ed-6f4f-41c5-a89d-bdc8352f10ff-kube-api-access-vrv6t\") pod \"ovnkube-node-prbg4\" (UID: \"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff\") " pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.870543 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.870939 4991 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.870967 4991 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.870993 4991 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871019 4991 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-log-socket\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871044 4991 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871067 4991 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871089 4991 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871108 4991 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871125 4991 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-node-log\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871143 4991 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871160 4991 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871176 4991 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871192 4991 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871208 4991 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-slash\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871223 4991 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871241 4991 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871256 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmj9m\" (UniqueName: \"kubernetes.io/projected/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-kube-api-access-cmj9m\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.871272 4991 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:11 crc kubenswrapper[4991]: I1006 08:32:11.954146 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:11 crc kubenswrapper[4991]: W1006 08:32:11.979052 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode30e40ed_6f4f_41c5_a89d_bdc8352f10ff.slice/crio-0c4d02aa904d7669b49c75e7e612edfc6040a758433e5ed558727736f4cdd490 WatchSource:0}: Error finding container 0c4d02aa904d7669b49c75e7e612edfc6040a758433e5ed558727736f4cdd490: Status 404 returned error can't find the container with id 0c4d02aa904d7669b49c75e7e612edfc6040a758433e5ed558727736f4cdd490 Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.341618 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovnkube-controller/3.log" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.344953 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovn-acl-logging/0.log" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.345769 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qwljw_977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/ovn-controller/0.log" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346437 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb" exitCode=0 Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346486 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119" exitCode=0 Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346511 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" 
containerID="023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114" exitCode=0 Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346503 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346562 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346587 4991 scope.go:117] "RemoveContainer" containerID="def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346529 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7" exitCode=0 Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346660 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6" exitCode=0 Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346681 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee" exitCode=0 Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346698 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" containerID="b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21" exitCode=143 Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346712 4991 generic.go:334] "Generic (PLEG): container finished" podID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" 
containerID="fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d" exitCode=143 Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346567 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346801 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346845 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346879 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346898 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346919 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346937 4991 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346948 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346962 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346976 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.346991 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347005 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347019 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347033 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347054 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347076 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347094 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347108 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347121 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347134 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347149 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347163 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee"} Oct 06 
08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347178 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347191 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347205 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347225 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347248 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347265 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347279 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347329 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347347 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347362 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347378 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347393 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347407 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347421 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347441 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qwljw" event={"ID":"977b0faa-5b3d-4e9d-bef4-ba47f8764c6e","Type":"ContainerDied","Data":"0327300df1417f6fb788ab88272e82c63b85a47a4f13f7399314ae024c9a0093"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347466 4991 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347482 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347495 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347509 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347522 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347537 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347550 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347564 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347577 4991 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.347587 4991 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.354082 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjvmw_58386a1a-6047-42ce-a952-43f397822919/kube-multus/2.log" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.354200 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjvmw" event={"ID":"58386a1a-6047-42ce-a952-43f397822919","Type":"ContainerStarted","Data":"d3d43eb6cc0c5a62f4ea388deb198ebe6e1c3dae39c4b6160f72924666e48b4f"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.357857 4991 generic.go:334] "Generic (PLEG): container finished" podID="e30e40ed-6f4f-41c5-a89d-bdc8352f10ff" containerID="8f92eadbbe45b736c72f7ac0cb6410ef4c715c78a46a97a1251043a63a4a2ded" exitCode=0 Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.357899 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" event={"ID":"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff","Type":"ContainerDied","Data":"8f92eadbbe45b736c72f7ac0cb6410ef4c715c78a46a97a1251043a63a4a2ded"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.357986 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" event={"ID":"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff","Type":"ContainerStarted","Data":"0c4d02aa904d7669b49c75e7e612edfc6040a758433e5ed558727736f4cdd490"} Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.381826 4991 scope.go:117] "RemoveContainer" 
containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.403244 4991 scope.go:117] "RemoveContainer" containerID="62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.435537 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qwljw"] Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.439270 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qwljw"] Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.441158 4991 scope.go:117] "RemoveContainer" containerID="023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.444675 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.486476 4991 scope.go:117] "RemoveContainer" containerID="f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.527747 4991 scope.go:117] "RemoveContainer" containerID="af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.528084 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-97r99"] Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.549750 4991 scope.go:117] "RemoveContainer" containerID="8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.565258 4991 scope.go:117] "RemoveContainer" containerID="b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.578462 4991 scope.go:117] "RemoveContainer" 
containerID="fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.592568 4991 scope.go:117] "RemoveContainer" containerID="451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.622886 4991 scope.go:117] "RemoveContainer" containerID="def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb" Oct 06 08:32:12 crc kubenswrapper[4991]: E1006 08:32:12.623483 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": container with ID starting with def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb not found: ID does not exist" containerID="def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.623532 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb"} err="failed to get container status \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": rpc error: code = NotFound desc = could not find container \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": container with ID starting with def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.623561 4991 scope.go:117] "RemoveContainer" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:32:12 crc kubenswrapper[4991]: E1006 08:32:12.624054 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\": container with ID starting with 
5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c not found: ID does not exist" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.624078 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c"} err="failed to get container status \"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\": rpc error: code = NotFound desc = could not find container \"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\": container with ID starting with 5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.624091 4991 scope.go:117] "RemoveContainer" containerID="62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119" Oct 06 08:32:12 crc kubenswrapper[4991]: E1006 08:32:12.624373 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\": container with ID starting with 62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119 not found: ID does not exist" containerID="62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.624395 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119"} err="failed to get container status \"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\": rpc error: code = NotFound desc = could not find container \"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\": container with ID starting with 62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119 not found: ID does not 
exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.624408 4991 scope.go:117] "RemoveContainer" containerID="023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114" Oct 06 08:32:12 crc kubenswrapper[4991]: E1006 08:32:12.624696 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\": container with ID starting with 023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114 not found: ID does not exist" containerID="023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.624713 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114"} err="failed to get container status \"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\": rpc error: code = NotFound desc = could not find container \"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\": container with ID starting with 023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.624725 4991 scope.go:117] "RemoveContainer" containerID="f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7" Oct 06 08:32:12 crc kubenswrapper[4991]: E1006 08:32:12.625095 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\": container with ID starting with f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7 not found: ID does not exist" containerID="f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.625114 4991 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7"} err="failed to get container status \"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\": rpc error: code = NotFound desc = could not find container \"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\": container with ID starting with f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.625128 4991 scope.go:117] "RemoveContainer" containerID="af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6" Oct 06 08:32:12 crc kubenswrapper[4991]: E1006 08:32:12.629460 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\": container with ID starting with af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6 not found: ID does not exist" containerID="af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.629670 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6"} err="failed to get container status \"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\": rpc error: code = NotFound desc = could not find container \"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\": container with ID starting with af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.629685 4991 scope.go:117] "RemoveContainer" containerID="8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee" Oct 06 08:32:12 crc kubenswrapper[4991]: E1006 08:32:12.630833 4991 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\": container with ID starting with 8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee not found: ID does not exist" containerID="8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.630871 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee"} err="failed to get container status \"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\": rpc error: code = NotFound desc = could not find container \"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\": container with ID starting with 8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.630885 4991 scope.go:117] "RemoveContainer" containerID="b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21" Oct 06 08:32:12 crc kubenswrapper[4991]: E1006 08:32:12.631089 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\": container with ID starting with b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21 not found: ID does not exist" containerID="b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.631119 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21"} err="failed to get container status \"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\": rpc error: code = NotFound desc = could 
not find container \"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\": container with ID starting with b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.631130 4991 scope.go:117] "RemoveContainer" containerID="fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d" Oct 06 08:32:12 crc kubenswrapper[4991]: E1006 08:32:12.631336 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\": container with ID starting with fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d not found: ID does not exist" containerID="fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.631355 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d"} err="failed to get container status \"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\": rpc error: code = NotFound desc = could not find container \"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\": container with ID starting with fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.631371 4991 scope.go:117] "RemoveContainer" containerID="451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83" Oct 06 08:32:12 crc kubenswrapper[4991]: E1006 08:32:12.631558 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\": container with ID starting with 451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83 not found: 
ID does not exist" containerID="451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.631627 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83"} err="failed to get container status \"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\": rpc error: code = NotFound desc = could not find container \"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\": container with ID starting with 451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.631683 4991 scope.go:117] "RemoveContainer" containerID="def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.631901 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb"} err="failed to get container status \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": rpc error: code = NotFound desc = could not find container \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": container with ID starting with def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.631928 4991 scope.go:117] "RemoveContainer" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.632495 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c"} err="failed to get container status \"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\": rpc error: code = 
NotFound desc = could not find container \"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\": container with ID starting with 5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.632522 4991 scope.go:117] "RemoveContainer" containerID="62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.632732 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119"} err="failed to get container status \"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\": rpc error: code = NotFound desc = could not find container \"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\": container with ID starting with 62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.632744 4991 scope.go:117] "RemoveContainer" containerID="023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.632906 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114"} err="failed to get container status \"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\": rpc error: code = NotFound desc = could not find container \"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\": container with ID starting with 023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.632920 4991 scope.go:117] "RemoveContainer" containerID="f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7" Oct 06 08:32:12 crc 
kubenswrapper[4991]: I1006 08:32:12.633085 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7"} err="failed to get container status \"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\": rpc error: code = NotFound desc = could not find container \"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\": container with ID starting with f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.633097 4991 scope.go:117] "RemoveContainer" containerID="af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.633404 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6"} err="failed to get container status \"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\": rpc error: code = NotFound desc = could not find container \"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\": container with ID starting with af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.633424 4991 scope.go:117] "RemoveContainer" containerID="8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.633598 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee"} err="failed to get container status \"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\": rpc error: code = NotFound desc = could not find container \"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\": container 
with ID starting with 8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.633632 4991 scope.go:117] "RemoveContainer" containerID="b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.633994 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21"} err="failed to get container status \"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\": rpc error: code = NotFound desc = could not find container \"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\": container with ID starting with b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.634109 4991 scope.go:117] "RemoveContainer" containerID="fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.634440 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d"} err="failed to get container status \"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\": rpc error: code = NotFound desc = could not find container \"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\": container with ID starting with fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.634581 4991 scope.go:117] "RemoveContainer" containerID="451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.637265 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83"} err="failed to get container status \"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\": rpc error: code = NotFound desc = could not find container \"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\": container with ID starting with 451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.637450 4991 scope.go:117] "RemoveContainer" containerID="def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.637959 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb"} err="failed to get container status \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": rpc error: code = NotFound desc = could not find container \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": container with ID starting with def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.638083 4991 scope.go:117] "RemoveContainer" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.638499 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c"} err="failed to get container status \"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\": rpc error: code = NotFound desc = could not find container \"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\": container with ID starting with 5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c not found: ID does not 
exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.638527 4991 scope.go:117] "RemoveContainer" containerID="62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.638848 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119"} err="failed to get container status \"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\": rpc error: code = NotFound desc = could not find container \"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\": container with ID starting with 62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.638868 4991 scope.go:117] "RemoveContainer" containerID="023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.639707 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114"} err="failed to get container status \"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\": rpc error: code = NotFound desc = could not find container \"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\": container with ID starting with 023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.639722 4991 scope.go:117] "RemoveContainer" containerID="f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.640090 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7"} err="failed to get container status 
\"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\": rpc error: code = NotFound desc = could not find container \"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\": container with ID starting with f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.640207 4991 scope.go:117] "RemoveContainer" containerID="af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.640593 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6"} err="failed to get container status \"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\": rpc error: code = NotFound desc = could not find container \"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\": container with ID starting with af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.640616 4991 scope.go:117] "RemoveContainer" containerID="8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.640835 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee"} err="failed to get container status \"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\": rpc error: code = NotFound desc = could not find container \"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\": container with ID starting with 8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.640854 4991 scope.go:117] "RemoveContainer" 
containerID="b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.641069 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21"} err="failed to get container status \"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\": rpc error: code = NotFound desc = could not find container \"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\": container with ID starting with b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.641087 4991 scope.go:117] "RemoveContainer" containerID="fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.641265 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d"} err="failed to get container status \"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\": rpc error: code = NotFound desc = could not find container \"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\": container with ID starting with fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.641284 4991 scope.go:117] "RemoveContainer" containerID="451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.641987 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83"} err="failed to get container status \"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\": rpc error: code = NotFound desc = could 
not find container \"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\": container with ID starting with 451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.642051 4991 scope.go:117] "RemoveContainer" containerID="def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.643818 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb"} err="failed to get container status \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": rpc error: code = NotFound desc = could not find container \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": container with ID starting with def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.643913 4991 scope.go:117] "RemoveContainer" containerID="5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.644325 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c"} err="failed to get container status \"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\": rpc error: code = NotFound desc = could not find container \"5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c\": container with ID starting with 5e7fcf967fbbd62b452e813de374524dabcf9990d42d2c1a8a5ebd0ab526067c not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.644432 4991 scope.go:117] "RemoveContainer" containerID="62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 
08:32:12.645033 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119"} err="failed to get container status \"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\": rpc error: code = NotFound desc = could not find container \"62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119\": container with ID starting with 62a2f07ee9690dcc1f4483e8256945e4c8c49e01107abf93c111603a0a4ac119 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.645121 4991 scope.go:117] "RemoveContainer" containerID="023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.645497 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114"} err="failed to get container status \"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\": rpc error: code = NotFound desc = could not find container \"023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114\": container with ID starting with 023cfe643023f0e632c0b4572ba548f2db6c37841bf28d60047de29899233114 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.645600 4991 scope.go:117] "RemoveContainer" containerID="f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.645988 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7"} err="failed to get container status \"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\": rpc error: code = NotFound desc = could not find container \"f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7\": container with ID starting with 
f22e60648b44228653d64ed192b9dd3e7f62c7b61761b8d38e7aeadbac14fff7 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.646102 4991 scope.go:117] "RemoveContainer" containerID="af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.646612 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6"} err="failed to get container status \"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\": rpc error: code = NotFound desc = could not find container \"af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6\": container with ID starting with af9b5760f042843e29a8069b6af83ed381972a5bd4c5bcdc2533b4d9fcc5efb6 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.646712 4991 scope.go:117] "RemoveContainer" containerID="8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.647061 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee"} err="failed to get container status \"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\": rpc error: code = NotFound desc = could not find container \"8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee\": container with ID starting with 8b2e00803f6a4c75fb9da245e25dabf5e46757d75e06337bb02b56c6f9c52bee not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.647085 4991 scope.go:117] "RemoveContainer" containerID="b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.647420 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21"} err="failed to get container status \"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\": rpc error: code = NotFound desc = could not find container \"b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21\": container with ID starting with b41bb0141ed9f2cb588d2f27e15c78c0edabfb829d3ff42360f090eb592ebc21 not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.647517 4991 scope.go:117] "RemoveContainer" containerID="fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.647887 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d"} err="failed to get container status \"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\": rpc error: code = NotFound desc = could not find container \"fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d\": container with ID starting with fd6b6f73aafaada6093181e9b5a009a363fb05c0cd10f73aa6cac15455ed071d not found: ID does not exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.647984 4991 scope.go:117] "RemoveContainer" containerID="451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.648362 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83"} err="failed to get container status \"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\": rpc error: code = NotFound desc = could not find container \"451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83\": container with ID starting with 451ddbcf125349519392e8be87f8781ff33738a203711f65689684275b1d6f83 not found: ID does not 
exist" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.648463 4991 scope.go:117] "RemoveContainer" containerID="def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb" Oct 06 08:32:12 crc kubenswrapper[4991]: I1006 08:32:12.648750 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb"} err="failed to get container status \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": rpc error: code = NotFound desc = could not find container \"def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb\": container with ID starting with def8a54d38fcfccd5bafcc01c59546ce491490c86c84bf54a2d001bc549f4dfb not found: ID does not exist" Oct 06 08:32:13 crc kubenswrapper[4991]: I1006 08:32:13.253171 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="977b0faa-5b3d-4e9d-bef4-ba47f8764c6e" path="/var/lib/kubelet/pods/977b0faa-5b3d-4e9d-bef4-ba47f8764c6e/volumes" Oct 06 08:32:13 crc kubenswrapper[4991]: I1006 08:32:13.366527 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" event={"ID":"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff","Type":"ContainerStarted","Data":"90fa1ea82dafe02b64d8934e2df009af3f5d80213243fdc7f23e876ac94d7cd8"} Oct 06 08:32:13 crc kubenswrapper[4991]: I1006 08:32:13.366566 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" event={"ID":"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff","Type":"ContainerStarted","Data":"c77cb10efd183ecea838f537a678ab801334da28ae4fddd58eece005bb11819c"} Oct 06 08:32:13 crc kubenswrapper[4991]: I1006 08:32:13.366576 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" event={"ID":"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff","Type":"ContainerStarted","Data":"d44460fb6b733da2054ebd3405f9eba389b9b17cf7db9802ee7ee28312ee8b9e"} Oct 06 
08:32:13 crc kubenswrapper[4991]: I1006 08:32:13.366588 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" event={"ID":"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff","Type":"ContainerStarted","Data":"9820bc885fc33a2152e271216d8bf73e00e8433a07cbf7a376f9aac3e8b4495b"} Oct 06 08:32:13 crc kubenswrapper[4991]: I1006 08:32:13.366597 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" event={"ID":"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff","Type":"ContainerStarted","Data":"90478af2b6f85f25da94df0b6b9eb84013696d76c3a0e07a4a61a0dcf4303ad5"} Oct 06 08:32:13 crc kubenswrapper[4991]: I1006 08:32:13.366605 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" event={"ID":"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff","Type":"ContainerStarted","Data":"25d1fec3e356251e11315fdc837298840c98484f90a2e6a7a2d4c3d2c5185bff"} Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.375011 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-97r99" podUID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerName="registry-server" containerID="cri-o://a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775" gracePeriod=2 Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.409527 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.409591 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.452348 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.560528 4991 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.605320 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-catalog-content\") pod \"eeebf0b9-1177-4117-931d-a67db6bfe581\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.605605 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-utilities\") pod \"eeebf0b9-1177-4117-931d-a67db6bfe581\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.605685 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ltqc\" (UniqueName: \"kubernetes.io/projected/eeebf0b9-1177-4117-931d-a67db6bfe581-kube-api-access-4ltqc\") pod \"eeebf0b9-1177-4117-931d-a67db6bfe581\" (UID: \"eeebf0b9-1177-4117-931d-a67db6bfe581\") " Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.606514 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-utilities" (OuterVolumeSpecName: "utilities") pod "eeebf0b9-1177-4117-931d-a67db6bfe581" (UID: "eeebf0b9-1177-4117-931d-a67db6bfe581"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.606934 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.613985 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeebf0b9-1177-4117-931d-a67db6bfe581-kube-api-access-4ltqc" (OuterVolumeSpecName: "kube-api-access-4ltqc") pod "eeebf0b9-1177-4117-931d-a67db6bfe581" (UID: "eeebf0b9-1177-4117-931d-a67db6bfe581"). InnerVolumeSpecName "kube-api-access-4ltqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.653903 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eeebf0b9-1177-4117-931d-a67db6bfe581" (UID: "eeebf0b9-1177-4117-931d-a67db6bfe581"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.708071 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeebf0b9-1177-4117-931d-a67db6bfe581-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:14 crc kubenswrapper[4991]: I1006 08:32:14.708136 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ltqc\" (UniqueName: \"kubernetes.io/projected/eeebf0b9-1177-4117-931d-a67db6bfe581-kube-api-access-4ltqc\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.391031 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" event={"ID":"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff","Type":"ContainerStarted","Data":"d46be28be5b3dd5353bb9ef515a7abe006bf20ae390510651a4a91a54edc113b"} Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.394098 4991 generic.go:334] "Generic (PLEG): container finished" podID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerID="a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775" exitCode=0 Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.394161 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-97r99" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.394177 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97r99" event={"ID":"eeebf0b9-1177-4117-931d-a67db6bfe581","Type":"ContainerDied","Data":"a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775"} Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.394481 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97r99" event={"ID":"eeebf0b9-1177-4117-931d-a67db6bfe581","Type":"ContainerDied","Data":"18e04623302a38f66a3454814e408077449aac7d288d0897d562e8c9692a252c"} Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.394528 4991 scope.go:117] "RemoveContainer" containerID="a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.418142 4991 scope.go:117] "RemoveContainer" containerID="c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.420910 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-97r99"] Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.429072 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-97r99"] Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.445742 4991 scope.go:117] "RemoveContainer" containerID="eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.462604 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.465542 4991 scope.go:117] "RemoveContainer" containerID="a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775" Oct 06 08:32:15 crc 
kubenswrapper[4991]: E1006 08:32:15.466006 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775\": container with ID starting with a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775 not found: ID does not exist" containerID="a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.466065 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775"} err="failed to get container status \"a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775\": rpc error: code = NotFound desc = could not find container \"a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775\": container with ID starting with a5f4b3cab1745e0de9ef01227bce9656234c45c9ab80af475bd2b9acb0361775 not found: ID does not exist" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.466097 4991 scope.go:117] "RemoveContainer" containerID="c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9" Oct 06 08:32:15 crc kubenswrapper[4991]: E1006 08:32:15.466538 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9\": container with ID starting with c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9 not found: ID does not exist" containerID="c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.466588 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9"} err="failed to get container status 
\"c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9\": rpc error: code = NotFound desc = could not find container \"c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9\": container with ID starting with c7ddc6421230a7eb6ba534e4a7425ce65675b6d0c51655da4742945cbf13f1a9 not found: ID does not exist" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.466619 4991 scope.go:117] "RemoveContainer" containerID="eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401" Oct 06 08:32:15 crc kubenswrapper[4991]: E1006 08:32:15.467125 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401\": container with ID starting with eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401 not found: ID does not exist" containerID="eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401" Oct 06 08:32:15 crc kubenswrapper[4991]: I1006 08:32:15.467204 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401"} err="failed to get container status \"eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401\": rpc error: code = NotFound desc = could not find container \"eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401\": container with ID starting with eb3f4efa69fbdcf0a804ce25c05f2be78ece5dfb93663cbc05062c1362bfe401 not found: ID does not exist" Oct 06 08:32:17 crc kubenswrapper[4991]: I1006 08:32:17.254951 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeebf0b9-1177-4117-931d-a67db6bfe581" path="/var/lib/kubelet/pods/eeebf0b9-1177-4117-931d-a67db6bfe581/volumes" Oct 06 08:32:17 crc kubenswrapper[4991]: I1006 08:32:17.721222 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlm55"] Oct 06 
08:32:17 crc kubenswrapper[4991]: I1006 08:32:17.721596 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wlm55" podUID="9333541d-b671-4a7b-b3a7-8d9646850c0d" containerName="registry-server" containerID="cri-o://ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68" gracePeriod=2 Oct 06 08:32:18 crc kubenswrapper[4991]: I1006 08:32:18.417538 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" event={"ID":"e30e40ed-6f4f-41c5-a89d-bdc8352f10ff","Type":"ContainerStarted","Data":"246f836c9441ec7d0d44574b34031b595e5848d281b83b454c25d5bc7a6376f4"} Oct 06 08:32:18 crc kubenswrapper[4991]: I1006 08:32:18.417987 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:18 crc kubenswrapper[4991]: I1006 08:32:18.418022 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:18 crc kubenswrapper[4991]: I1006 08:32:18.418049 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:18 crc kubenswrapper[4991]: I1006 08:32:18.452155 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:18 crc kubenswrapper[4991]: I1006 08:32:18.457007 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" podStartSLOduration=7.456984603 podStartE2EDuration="7.456984603s" podCreationTimestamp="2025-10-06 08:32:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:32:18.452634963 +0000 UTC m=+790.190384994" watchObservedRunningTime="2025-10-06 08:32:18.456984603 +0000 UTC m=+790.194734634" Oct 06 
08:32:18 crc kubenswrapper[4991]: I1006 08:32:18.460585 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.060042 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.166689 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-catalog-content\") pod \"9333541d-b671-4a7b-b3a7-8d9646850c0d\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.166823 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d6vd\" (UniqueName: \"kubernetes.io/projected/9333541d-b671-4a7b-b3a7-8d9646850c0d-kube-api-access-4d6vd\") pod \"9333541d-b671-4a7b-b3a7-8d9646850c0d\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.166868 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-utilities\") pod \"9333541d-b671-4a7b-b3a7-8d9646850c0d\" (UID: \"9333541d-b671-4a7b-b3a7-8d9646850c0d\") " Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.167946 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-utilities" (OuterVolumeSpecName: "utilities") pod "9333541d-b671-4a7b-b3a7-8d9646850c0d" (UID: "9333541d-b671-4a7b-b3a7-8d9646850c0d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.174105 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9333541d-b671-4a7b-b3a7-8d9646850c0d-kube-api-access-4d6vd" (OuterVolumeSpecName: "kube-api-access-4d6vd") pod "9333541d-b671-4a7b-b3a7-8d9646850c0d" (UID: "9333541d-b671-4a7b-b3a7-8d9646850c0d"). InnerVolumeSpecName "kube-api-access-4d6vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.189793 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9333541d-b671-4a7b-b3a7-8d9646850c0d" (UID: "9333541d-b671-4a7b-b3a7-8d9646850c0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.268270 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d6vd\" (UniqueName: \"kubernetes.io/projected/9333541d-b671-4a7b-b3a7-8d9646850c0d-kube-api-access-4d6vd\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.268319 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.268331 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9333541d-b671-4a7b-b3a7-8d9646850c0d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.427011 4991 generic.go:334] "Generic (PLEG): container finished" podID="9333541d-b671-4a7b-b3a7-8d9646850c0d" 
containerID="ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68" exitCode=0 Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.427129 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlm55" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.427129 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlm55" event={"ID":"9333541d-b671-4a7b-b3a7-8d9646850c0d","Type":"ContainerDied","Data":"ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68"} Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.427368 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlm55" event={"ID":"9333541d-b671-4a7b-b3a7-8d9646850c0d","Type":"ContainerDied","Data":"43a8465af9b1627724ca473ceb17359b0ccf9dbc512997a8d70b67944be208eb"} Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.427401 4991 scope.go:117] "RemoveContainer" containerID="ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.442761 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlm55"] Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.448571 4991 scope.go:117] "RemoveContainer" containerID="6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.452606 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlm55"] Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.466071 4991 scope.go:117] "RemoveContainer" containerID="0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.498434 4991 scope.go:117] "RemoveContainer" containerID="ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68" Oct 06 
08:32:19 crc kubenswrapper[4991]: E1006 08:32:19.498810 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68\": container with ID starting with ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68 not found: ID does not exist" containerID="ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.498858 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68"} err="failed to get container status \"ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68\": rpc error: code = NotFound desc = could not find container \"ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68\": container with ID starting with ac63517eda96e7f4252fa2722103d098185392bd6233320bab7f749d9822de68 not found: ID does not exist" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.498888 4991 scope.go:117] "RemoveContainer" containerID="6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19" Oct 06 08:32:19 crc kubenswrapper[4991]: E1006 08:32:19.499186 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19\": container with ID starting with 6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19 not found: ID does not exist" containerID="6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.499227 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19"} err="failed to get container status 
\"6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19\": rpc error: code = NotFound desc = could not find container \"6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19\": container with ID starting with 6f7aad2f23605efc0dd6573003fa3e3d1d2e476f3c5b4fe7841512b60da8da19 not found: ID does not exist" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.499257 4991 scope.go:117] "RemoveContainer" containerID="0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e" Oct 06 08:32:19 crc kubenswrapper[4991]: E1006 08:32:19.499555 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e\": container with ID starting with 0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e not found: ID does not exist" containerID="0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.499581 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e"} err="failed to get container status \"0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e\": rpc error: code = NotFound desc = could not find container \"0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e\": container with ID starting with 0a0171907646cd3c22ba2751cf49d5a7f546a236285f2c0d6e93f2f55cca544e not found: ID does not exist" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.642183 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-t7gk4"] Oct 06 08:32:19 crc kubenswrapper[4991]: E1006 08:32:19.642496 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerName="extract-content" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.642530 4991 
state_mem.go:107] "Deleted CPUSet assignment" podUID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerName="extract-content" Oct 06 08:32:19 crc kubenswrapper[4991]: E1006 08:32:19.642553 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerName="registry-server" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.642565 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerName="registry-server" Oct 06 08:32:19 crc kubenswrapper[4991]: E1006 08:32:19.642579 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9333541d-b671-4a7b-b3a7-8d9646850c0d" containerName="extract-content" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.642589 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9333541d-b671-4a7b-b3a7-8d9646850c0d" containerName="extract-content" Oct 06 08:32:19 crc kubenswrapper[4991]: E1006 08:32:19.642606 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9333541d-b671-4a7b-b3a7-8d9646850c0d" containerName="registry-server" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.642617 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9333541d-b671-4a7b-b3a7-8d9646850c0d" containerName="registry-server" Oct 06 08:32:19 crc kubenswrapper[4991]: E1006 08:32:19.642641 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerName="extract-utilities" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.642653 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerName="extract-utilities" Oct 06 08:32:19 crc kubenswrapper[4991]: E1006 08:32:19.642675 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9333541d-b671-4a7b-b3a7-8d9646850c0d" containerName="extract-utilities" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.642865 4991 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9333541d-b671-4a7b-b3a7-8d9646850c0d" containerName="extract-utilities" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.643038 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeebf0b9-1177-4117-931d-a67db6bfe581" containerName="registry-server" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.643068 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="9333541d-b671-4a7b-b3a7-8d9646850c0d" containerName="registry-server" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.643650 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.645346 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.646085 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.646281 4991 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6w22z" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.646510 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.659654 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-t7gk4"] Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.774587 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/acb7d775-74d4-439c-81db-6353bd34cdfa-crc-storage\") pod \"crc-storage-crc-t7gk4\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") " pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.774798 
4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/acb7d775-74d4-439c-81db-6353bd34cdfa-node-mnt\") pod \"crc-storage-crc-t7gk4\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") " pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.775024 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxc4z\" (UniqueName: \"kubernetes.io/projected/acb7d775-74d4-439c-81db-6353bd34cdfa-kube-api-access-nxc4z\") pod \"crc-storage-crc-t7gk4\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") " pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.876148 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxc4z\" (UniqueName: \"kubernetes.io/projected/acb7d775-74d4-439c-81db-6353bd34cdfa-kube-api-access-nxc4z\") pod \"crc-storage-crc-t7gk4\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") " pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.876498 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/acb7d775-74d4-439c-81db-6353bd34cdfa-crc-storage\") pod \"crc-storage-crc-t7gk4\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") " pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.876558 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/acb7d775-74d4-439c-81db-6353bd34cdfa-node-mnt\") pod \"crc-storage-crc-t7gk4\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") " pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.876823 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-mnt\" (UniqueName: \"kubernetes.io/host-path/acb7d775-74d4-439c-81db-6353bd34cdfa-node-mnt\") pod \"crc-storage-crc-t7gk4\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") " pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.877600 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/acb7d775-74d4-439c-81db-6353bd34cdfa-crc-storage\") pod \"crc-storage-crc-t7gk4\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") " pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.896721 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxc4z\" (UniqueName: \"kubernetes.io/projected/acb7d775-74d4-439c-81db-6353bd34cdfa-kube-api-access-nxc4z\") pod \"crc-storage-crc-t7gk4\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") " pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:19 crc kubenswrapper[4991]: I1006 08:32:19.967721 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:20 crc kubenswrapper[4991]: E1006 08:32:20.001310 4991 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-t7gk4_crc-storage_acb7d775-74d4-439c-81db-6353bd34cdfa_0(17579d6eb5bceaadb032375907c61019376895bca919d432f88d311ca20715db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 08:32:20 crc kubenswrapper[4991]: E1006 08:32:20.001396 4991 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-t7gk4_crc-storage_acb7d775-74d4-439c-81db-6353bd34cdfa_0(17579d6eb5bceaadb032375907c61019376895bca919d432f88d311ca20715db): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:20 crc kubenswrapper[4991]: E1006 08:32:20.001419 4991 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-t7gk4_crc-storage_acb7d775-74d4-439c-81db-6353bd34cdfa_0(17579d6eb5bceaadb032375907c61019376895bca919d432f88d311ca20715db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:20 crc kubenswrapper[4991]: E1006 08:32:20.001469 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-t7gk4_crc-storage(acb7d775-74d4-439c-81db-6353bd34cdfa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-t7gk4_crc-storage(acb7d775-74d4-439c-81db-6353bd34cdfa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-t7gk4_crc-storage_acb7d775-74d4-439c-81db-6353bd34cdfa_0(17579d6eb5bceaadb032375907c61019376895bca919d432f88d311ca20715db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-t7gk4" podUID="acb7d775-74d4-439c-81db-6353bd34cdfa" Oct 06 08:32:20 crc kubenswrapper[4991]: I1006 08:32:20.436095 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:20 crc kubenswrapper[4991]: I1006 08:32:20.437629 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:20 crc kubenswrapper[4991]: E1006 08:32:20.463647 4991 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-t7gk4_crc-storage_acb7d775-74d4-439c-81db-6353bd34cdfa_0(ca60dfa74431c4c991fb60c99fd3fbf89ee89a6e960aff02dfefcb9b6e430e80): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 08:32:20 crc kubenswrapper[4991]: E1006 08:32:20.463730 4991 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-t7gk4_crc-storage_acb7d775-74d4-439c-81db-6353bd34cdfa_0(ca60dfa74431c4c991fb60c99fd3fbf89ee89a6e960aff02dfefcb9b6e430e80): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:20 crc kubenswrapper[4991]: E1006 08:32:20.463755 4991 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-t7gk4_crc-storage_acb7d775-74d4-439c-81db-6353bd34cdfa_0(ca60dfa74431c4c991fb60c99fd3fbf89ee89a6e960aff02dfefcb9b6e430e80): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:20 crc kubenswrapper[4991]: E1006 08:32:20.463813 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-t7gk4_crc-storage(acb7d775-74d4-439c-81db-6353bd34cdfa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-t7gk4_crc-storage(acb7d775-74d4-439c-81db-6353bd34cdfa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-t7gk4_crc-storage_acb7d775-74d4-439c-81db-6353bd34cdfa_0(ca60dfa74431c4c991fb60c99fd3fbf89ee89a6e960aff02dfefcb9b6e430e80): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-t7gk4" podUID="acb7d775-74d4-439c-81db-6353bd34cdfa" Oct 06 08:32:21 crc kubenswrapper[4991]: I1006 08:32:21.255589 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9333541d-b671-4a7b-b3a7-8d9646850c0d" path="/var/lib/kubelet/pods/9333541d-b671-4a7b-b3a7-8d9646850c0d/volumes" Oct 06 08:32:32 crc kubenswrapper[4991]: I1006 08:32:32.243606 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:32 crc kubenswrapper[4991]: I1006 08:32:32.245026 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-t7gk4" Oct 06 08:32:32 crc kubenswrapper[4991]: I1006 08:32:32.718099 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-t7gk4"] Oct 06 08:32:32 crc kubenswrapper[4991]: W1006 08:32:32.726895 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacb7d775_74d4_439c_81db_6353bd34cdfa.slice/crio-59415576cc1d7a65529b23977380ff67cc3ee3d242b5caff17a98243c8f884cf WatchSource:0}: Error finding container 59415576cc1d7a65529b23977380ff67cc3ee3d242b5caff17a98243c8f884cf: Status 404 returned error can't find the container with id 59415576cc1d7a65529b23977380ff67cc3ee3d242b5caff17a98243c8f884cf Oct 06 08:32:33 crc kubenswrapper[4991]: I1006 08:32:33.522172 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-t7gk4" event={"ID":"acb7d775-74d4-439c-81db-6353bd34cdfa","Type":"ContainerStarted","Data":"59415576cc1d7a65529b23977380ff67cc3ee3d242b5caff17a98243c8f884cf"} Oct 06 08:32:34 crc kubenswrapper[4991]: I1006 08:32:34.530814 4991 generic.go:334] "Generic (PLEG): container finished" podID="acb7d775-74d4-439c-81db-6353bd34cdfa" containerID="ada0d6a09f42ee6e8270e002ea17c661a45fdca16f498f19a36757f82986ee4c" exitCode=0 Oct 06 08:32:34 crc kubenswrapper[4991]: I1006 08:32:34.530888 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-t7gk4" event={"ID":"acb7d775-74d4-439c-81db-6353bd34cdfa","Type":"ContainerDied","Data":"ada0d6a09f42ee6e8270e002ea17c661a45fdca16f498f19a36757f82986ee4c"} Oct 06 08:32:35 crc kubenswrapper[4991]: I1006 08:32:35.805754 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-t7gk4"
Oct 06 08:32:35 crc kubenswrapper[4991]: I1006 08:32:35.983769 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxc4z\" (UniqueName: \"kubernetes.io/projected/acb7d775-74d4-439c-81db-6353bd34cdfa-kube-api-access-nxc4z\") pod \"acb7d775-74d4-439c-81db-6353bd34cdfa\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") "
Oct 06 08:32:35 crc kubenswrapper[4991]: I1006 08:32:35.983851 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/acb7d775-74d4-439c-81db-6353bd34cdfa-node-mnt\") pod \"acb7d775-74d4-439c-81db-6353bd34cdfa\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") "
Oct 06 08:32:35 crc kubenswrapper[4991]: I1006 08:32:35.983935 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/acb7d775-74d4-439c-81db-6353bd34cdfa-crc-storage\") pod \"acb7d775-74d4-439c-81db-6353bd34cdfa\" (UID: \"acb7d775-74d4-439c-81db-6353bd34cdfa\") "
Oct 06 08:32:35 crc kubenswrapper[4991]: I1006 08:32:35.983944 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acb7d775-74d4-439c-81db-6353bd34cdfa-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "acb7d775-74d4-439c-81db-6353bd34cdfa" (UID: "acb7d775-74d4-439c-81db-6353bd34cdfa"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 06 08:32:35 crc kubenswrapper[4991]: I1006 08:32:35.984284 4991 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/acb7d775-74d4-439c-81db-6353bd34cdfa-node-mnt\") on node \"crc\" DevicePath \"\""
Oct 06 08:32:35 crc kubenswrapper[4991]: I1006 08:32:35.991559 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb7d775-74d4-439c-81db-6353bd34cdfa-kube-api-access-nxc4z" (OuterVolumeSpecName: "kube-api-access-nxc4z") pod "acb7d775-74d4-439c-81db-6353bd34cdfa" (UID: "acb7d775-74d4-439c-81db-6353bd34cdfa"). InnerVolumeSpecName "kube-api-access-nxc4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:32:36 crc kubenswrapper[4991]: I1006 08:32:36.006367 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb7d775-74d4-439c-81db-6353bd34cdfa-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "acb7d775-74d4-439c-81db-6353bd34cdfa" (UID: "acb7d775-74d4-439c-81db-6353bd34cdfa"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:32:36 crc kubenswrapper[4991]: I1006 08:32:36.084938 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxc4z\" (UniqueName: \"kubernetes.io/projected/acb7d775-74d4-439c-81db-6353bd34cdfa-kube-api-access-nxc4z\") on node \"crc\" DevicePath \"\""
Oct 06 08:32:36 crc kubenswrapper[4991]: I1006 08:32:36.085467 4991 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/acb7d775-74d4-439c-81db-6353bd34cdfa-crc-storage\") on node \"crc\" DevicePath \"\""
Oct 06 08:32:36 crc kubenswrapper[4991]: I1006 08:32:36.545604 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-t7gk4" event={"ID":"acb7d775-74d4-439c-81db-6353bd34cdfa","Type":"ContainerDied","Data":"59415576cc1d7a65529b23977380ff67cc3ee3d242b5caff17a98243c8f884cf"}
Oct 06 08:32:36 crc kubenswrapper[4991]: I1006 08:32:36.545647 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59415576cc1d7a65529b23977380ff67cc3ee3d242b5caff17a98243c8f884cf"
Oct 06 08:32:36 crc kubenswrapper[4991]: I1006 08:32:36.545693 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-t7gk4"
Oct 06 08:32:41 crc kubenswrapper[4991]: I1006 08:32:41.989176 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-prbg4"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.577227 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"]
Oct 06 08:32:44 crc kubenswrapper[4991]: E1006 08:32:44.577580 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb7d775-74d4-439c-81db-6353bd34cdfa" containerName="storage"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.577602 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb7d775-74d4-439c-81db-6353bd34cdfa" containerName="storage"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.577786 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb7d775-74d4-439c-81db-6353bd34cdfa" containerName="storage"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.579012 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.582955 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.601942 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"]
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.693520 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.693574 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.693612 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d55m8\" (UniqueName: \"kubernetes.io/projected/8d34a20f-3314-4dd9-aa32-75c53762962e-kube-api-access-d55m8\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.794394 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.794444 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d55m8\" (UniqueName: \"kubernetes.io/projected/8d34a20f-3314-4dd9-aa32-75c53762962e-kube-api-access-d55m8\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.794510 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.794968 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.795108 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.826845 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d55m8\" (UniqueName: \"kubernetes.io/projected/8d34a20f-3314-4dd9-aa32-75c53762962e-kube-api-access-d55m8\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:44 crc kubenswrapper[4991]: I1006 08:32:44.911351 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:45 crc kubenswrapper[4991]: I1006 08:32:45.179363 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"]
Oct 06 08:32:45 crc kubenswrapper[4991]: I1006 08:32:45.603177 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82" event={"ID":"8d34a20f-3314-4dd9-aa32-75c53762962e","Type":"ContainerStarted","Data":"7ccfc0325081da28f2dbf17fd1eb7e22fccb207d78e561668d1b89d8a3a16170"}
Oct 06 08:32:45 crc kubenswrapper[4991]: I1006 08:32:45.603247 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82" event={"ID":"8d34a20f-3314-4dd9-aa32-75c53762962e","Type":"ContainerStarted","Data":"843cca66ff9abf1950527fd8838612f33a4dd42c314c034d84ba0ba22ee7835d"}
Oct 06 08:32:46 crc kubenswrapper[4991]: I1006 08:32:46.610983 4991 generic.go:334] "Generic (PLEG): container finished" podID="8d34a20f-3314-4dd9-aa32-75c53762962e" containerID="7ccfc0325081da28f2dbf17fd1eb7e22fccb207d78e561668d1b89d8a3a16170" exitCode=0
Oct 06 08:32:46 crc kubenswrapper[4991]: I1006 08:32:46.611037 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82" event={"ID":"8d34a20f-3314-4dd9-aa32-75c53762962e","Type":"ContainerDied","Data":"7ccfc0325081da28f2dbf17fd1eb7e22fccb207d78e561668d1b89d8a3a16170"}
Oct 06 08:32:46 crc kubenswrapper[4991]: I1006 08:32:46.913234 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-66bs5"]
Oct 06 08:32:46 crc kubenswrapper[4991]: I1006 08:32:46.921733 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:46 crc kubenswrapper[4991]: I1006 08:32:46.942004 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-66bs5"]
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.026604 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-utilities\") pod \"redhat-operators-66bs5\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") " pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.026696 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-catalog-content\") pod \"redhat-operators-66bs5\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") " pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.026794 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv57d\" (UniqueName: \"kubernetes.io/projected/f143de69-7751-4e66-8e57-f1d715c7f14e-kube-api-access-zv57d\") pod \"redhat-operators-66bs5\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") " pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.127986 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-utilities\") pod \"redhat-operators-66bs5\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") " pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.128059 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-catalog-content\") pod \"redhat-operators-66bs5\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") " pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.128119 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv57d\" (UniqueName: \"kubernetes.io/projected/f143de69-7751-4e66-8e57-f1d715c7f14e-kube-api-access-zv57d\") pod \"redhat-operators-66bs5\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") " pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.128718 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-utilities\") pod \"redhat-operators-66bs5\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") " pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.128768 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-catalog-content\") pod \"redhat-operators-66bs5\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") " pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.149367 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv57d\" (UniqueName: \"kubernetes.io/projected/f143de69-7751-4e66-8e57-f1d715c7f14e-kube-api-access-zv57d\") pod \"redhat-operators-66bs5\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") " pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.258651 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:47 crc kubenswrapper[4991]: I1006 08:32:47.693086 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-66bs5"]
Oct 06 08:32:47 crc kubenswrapper[4991]: W1006 08:32:47.790319 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf143de69_7751_4e66_8e57_f1d715c7f14e.slice/crio-ea2fbae9a26ab63199c0a0c018d40223cc954088d5d15f29f733b8793596f6c6 WatchSource:0}: Error finding container ea2fbae9a26ab63199c0a0c018d40223cc954088d5d15f29f733b8793596f6c6: Status 404 returned error can't find the container with id ea2fbae9a26ab63199c0a0c018d40223cc954088d5d15f29f733b8793596f6c6
Oct 06 08:32:48 crc kubenswrapper[4991]: I1006 08:32:48.624674 4991 generic.go:334] "Generic (PLEG): container finished" podID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerID="86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0" exitCode=0
Oct 06 08:32:48 crc kubenswrapper[4991]: I1006 08:32:48.624745 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66bs5" event={"ID":"f143de69-7751-4e66-8e57-f1d715c7f14e","Type":"ContainerDied","Data":"86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0"}
Oct 06 08:32:48 crc kubenswrapper[4991]: I1006 08:32:48.625366 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66bs5" event={"ID":"f143de69-7751-4e66-8e57-f1d715c7f14e","Type":"ContainerStarted","Data":"ea2fbae9a26ab63199c0a0c018d40223cc954088d5d15f29f733b8793596f6c6"}
Oct 06 08:32:48 crc kubenswrapper[4991]: I1006 08:32:48.629630 4991 generic.go:334] "Generic (PLEG): container finished" podID="8d34a20f-3314-4dd9-aa32-75c53762962e" containerID="5db1f82b13e581781398e42a79b06980791cdd61fecda12fd28960d9ed11e739" exitCode=0
Oct 06 08:32:48 crc kubenswrapper[4991]: I1006 08:32:48.629672 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82" event={"ID":"8d34a20f-3314-4dd9-aa32-75c53762962e","Type":"ContainerDied","Data":"5db1f82b13e581781398e42a79b06980791cdd61fecda12fd28960d9ed11e739"}
Oct 06 08:32:49 crc kubenswrapper[4991]: I1006 08:32:49.638834 4991 generic.go:334] "Generic (PLEG): container finished" podID="8d34a20f-3314-4dd9-aa32-75c53762962e" containerID="1f745770e8fd9d36c6da6cc22433567451e0fa7c7e39d052c14bf6ace7e11136" exitCode=0
Oct 06 08:32:49 crc kubenswrapper[4991]: I1006 08:32:49.638894 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82" event={"ID":"8d34a20f-3314-4dd9-aa32-75c53762962e","Type":"ContainerDied","Data":"1f745770e8fd9d36c6da6cc22433567451e0fa7c7e39d052c14bf6ace7e11136"}
Oct 06 08:32:49 crc kubenswrapper[4991]: I1006 08:32:49.646238 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66bs5" event={"ID":"f143de69-7751-4e66-8e57-f1d715c7f14e","Type":"ContainerStarted","Data":"8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84"}
Oct 06 08:32:50 crc kubenswrapper[4991]: I1006 08:32:50.659707 4991 generic.go:334] "Generic (PLEG): container finished" podID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerID="8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84" exitCode=0
Oct 06 08:32:50 crc kubenswrapper[4991]: I1006 08:32:50.660635 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66bs5" event={"ID":"f143de69-7751-4e66-8e57-f1d715c7f14e","Type":"ContainerDied","Data":"8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84"}
Oct 06 08:32:50 crc kubenswrapper[4991]: I1006 08:32:50.962688 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:50 crc kubenswrapper[4991]: I1006 08:32:50.981193 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-util\") pod \"8d34a20f-3314-4dd9-aa32-75c53762962e\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") "
Oct 06 08:32:50 crc kubenswrapper[4991]: I1006 08:32:50.981493 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d55m8\" (UniqueName: \"kubernetes.io/projected/8d34a20f-3314-4dd9-aa32-75c53762962e-kube-api-access-d55m8\") pod \"8d34a20f-3314-4dd9-aa32-75c53762962e\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") "
Oct 06 08:32:50 crc kubenswrapper[4991]: I1006 08:32:50.981749 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-bundle\") pod \"8d34a20f-3314-4dd9-aa32-75c53762962e\" (UID: \"8d34a20f-3314-4dd9-aa32-75c53762962e\") "
Oct 06 08:32:50 crc kubenswrapper[4991]: I1006 08:32:50.982428 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-bundle" (OuterVolumeSpecName: "bundle") pod "8d34a20f-3314-4dd9-aa32-75c53762962e" (UID: "8d34a20f-3314-4dd9-aa32-75c53762962e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:32:50 crc kubenswrapper[4991]: I1006 08:32:50.982601 4991 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 08:32:50 crc kubenswrapper[4991]: I1006 08:32:50.987764 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d34a20f-3314-4dd9-aa32-75c53762962e-kube-api-access-d55m8" (OuterVolumeSpecName: "kube-api-access-d55m8") pod "8d34a20f-3314-4dd9-aa32-75c53762962e" (UID: "8d34a20f-3314-4dd9-aa32-75c53762962e"). InnerVolumeSpecName "kube-api-access-d55m8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:32:51 crc kubenswrapper[4991]: I1006 08:32:51.006536 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-util" (OuterVolumeSpecName: "util") pod "8d34a20f-3314-4dd9-aa32-75c53762962e" (UID: "8d34a20f-3314-4dd9-aa32-75c53762962e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:32:51 crc kubenswrapper[4991]: I1006 08:32:51.083668 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d55m8\" (UniqueName: \"kubernetes.io/projected/8d34a20f-3314-4dd9-aa32-75c53762962e-kube-api-access-d55m8\") on node \"crc\" DevicePath \"\""
Oct 06 08:32:51 crc kubenswrapper[4991]: I1006 08:32:51.083713 4991 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d34a20f-3314-4dd9-aa32-75c53762962e-util\") on node \"crc\" DevicePath \"\""
Oct 06 08:32:51 crc kubenswrapper[4991]: I1006 08:32:51.670996 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82" event={"ID":"8d34a20f-3314-4dd9-aa32-75c53762962e","Type":"ContainerDied","Data":"843cca66ff9abf1950527fd8838612f33a4dd42c314c034d84ba0ba22ee7835d"}
Oct 06 08:32:51 crc kubenswrapper[4991]: I1006 08:32:51.671063 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="843cca66ff9abf1950527fd8838612f33a4dd42c314c034d84ba0ba22ee7835d"
Oct 06 08:32:51 crc kubenswrapper[4991]: I1006 08:32:51.671067 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82"
Oct 06 08:32:51 crc kubenswrapper[4991]: I1006 08:32:51.679083 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66bs5" event={"ID":"f143de69-7751-4e66-8e57-f1d715c7f14e","Type":"ContainerStarted","Data":"a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c"}
Oct 06 08:32:51 crc kubenswrapper[4991]: I1006 08:32:51.713338 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-66bs5" podStartSLOduration=3.149051534 podStartE2EDuration="5.713269244s" podCreationTimestamp="2025-10-06 08:32:46 +0000 UTC" firstStartedPulling="2025-10-06 08:32:48.626796077 +0000 UTC m=+820.364546098" lastFinishedPulling="2025-10-06 08:32:51.191013787 +0000 UTC m=+822.928763808" observedRunningTime="2025-10-06 08:32:51.705417909 +0000 UTC m=+823.443167940" watchObservedRunningTime="2025-10-06 08:32:51.713269244 +0000 UTC m=+823.451019315"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.070177 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg"]
Oct 06 08:32:55 crc kubenswrapper[4991]: E1006 08:32:55.070458 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d34a20f-3314-4dd9-aa32-75c53762962e" containerName="extract"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.070476 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d34a20f-3314-4dd9-aa32-75c53762962e" containerName="extract"
Oct 06 08:32:55 crc kubenswrapper[4991]: E1006 08:32:55.070506 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d34a20f-3314-4dd9-aa32-75c53762962e" containerName="pull"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.070514 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d34a20f-3314-4dd9-aa32-75c53762962e" containerName="pull"
Oct 06 08:32:55 crc kubenswrapper[4991]: E1006 08:32:55.070527 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d34a20f-3314-4dd9-aa32-75c53762962e" containerName="util"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.070534 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d34a20f-3314-4dd9-aa32-75c53762962e" containerName="util"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.070643 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d34a20f-3314-4dd9-aa32-75c53762962e" containerName="extract"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.071089 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.073841 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-trh5g"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.073975 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.073975 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.084659 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg"]
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.232788 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf5t6\" (UniqueName: \"kubernetes.io/projected/7c58f931-7306-45f7-a983-134a70c9952a-kube-api-access-gf5t6\") pod \"nmstate-operator-858ddd8f98-mx8tg\" (UID: \"7c58f931-7306-45f7-a983-134a70c9952a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.333821 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf5t6\" (UniqueName: \"kubernetes.io/projected/7c58f931-7306-45f7-a983-134a70c9952a-kube-api-access-gf5t6\") pod \"nmstate-operator-858ddd8f98-mx8tg\" (UID: \"7c58f931-7306-45f7-a983-134a70c9952a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.355827 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf5t6\" (UniqueName: \"kubernetes.io/projected/7c58f931-7306-45f7-a983-134a70c9952a-kube-api-access-gf5t6\") pod \"nmstate-operator-858ddd8f98-mx8tg\" (UID: \"7c58f931-7306-45f7-a983-134a70c9952a\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.386928 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg"
Oct 06 08:32:55 crc kubenswrapper[4991]: I1006 08:32:55.804433 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg"]
Oct 06 08:32:56 crc kubenswrapper[4991]: I1006 08:32:56.709725 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg" event={"ID":"7c58f931-7306-45f7-a983-134a70c9952a","Type":"ContainerStarted","Data":"d2afda3ce427e019fb2e088b89bc37528315e8793a73cf3351025526ca7a2eff"}
Oct 06 08:32:57 crc kubenswrapper[4991]: I1006 08:32:57.258808 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:57 crc kubenswrapper[4991]: I1006 08:32:57.259281 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:57 crc kubenswrapper[4991]: I1006 08:32:57.306748 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:57 crc kubenswrapper[4991]: I1006 08:32:57.771075 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:32:58 crc kubenswrapper[4991]: I1006 08:32:58.725447 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg" event={"ID":"7c58f931-7306-45f7-a983-134a70c9952a","Type":"ContainerStarted","Data":"9f6c75a816ee15b92aa9788b7537f11ca484a3372035db008036ad70ebe85923"}
Oct 06 08:32:58 crc kubenswrapper[4991]: I1006 08:32:58.779370 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-mx8tg" podStartSLOduration=2.022324902 podStartE2EDuration="3.779346678s" podCreationTimestamp="2025-10-06 08:32:55 +0000 UTC" firstStartedPulling="2025-10-06 08:32:55.815784186 +0000 UTC m=+827.553534217" lastFinishedPulling="2025-10-06 08:32:57.572805972 +0000 UTC m=+829.310555993" observedRunningTime="2025-10-06 08:32:58.772700207 +0000 UTC m=+830.510450268" watchObservedRunningTime="2025-10-06 08:32:58.779346678 +0000 UTC m=+830.517096739"
Oct 06 08:32:59 crc kubenswrapper[4991]: I1006 08:32:59.905252 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-66bs5"]
Oct 06 08:33:00 crc kubenswrapper[4991]: I1006 08:33:00.738626 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-66bs5" podUID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerName="registry-server" containerID="cri-o://a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c" gracePeriod=2
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.203275 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.310039 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-utilities\") pod \"f143de69-7751-4e66-8e57-f1d715c7f14e\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") "
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.310091 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-catalog-content\") pod \"f143de69-7751-4e66-8e57-f1d715c7f14e\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") "
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.310147 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv57d\" (UniqueName: \"kubernetes.io/projected/f143de69-7751-4e66-8e57-f1d715c7f14e-kube-api-access-zv57d\") pod \"f143de69-7751-4e66-8e57-f1d715c7f14e\" (UID: \"f143de69-7751-4e66-8e57-f1d715c7f14e\") "
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.310882 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-utilities" (OuterVolumeSpecName: "utilities") pod "f143de69-7751-4e66-8e57-f1d715c7f14e" (UID: "f143de69-7751-4e66-8e57-f1d715c7f14e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.315625 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f143de69-7751-4e66-8e57-f1d715c7f14e-kube-api-access-zv57d" (OuterVolumeSpecName: "kube-api-access-zv57d") pod "f143de69-7751-4e66-8e57-f1d715c7f14e" (UID: "f143de69-7751-4e66-8e57-f1d715c7f14e"). InnerVolumeSpecName "kube-api-access-zv57d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.409180 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f143de69-7751-4e66-8e57-f1d715c7f14e" (UID: "f143de69-7751-4e66-8e57-f1d715c7f14e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.411126 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.411159 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f143de69-7751-4e66-8e57-f1d715c7f14e-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.411170 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv57d\" (UniqueName: \"kubernetes.io/projected/f143de69-7751-4e66-8e57-f1d715c7f14e-kube-api-access-zv57d\") on node \"crc\" DevicePath \"\""
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.746637 4991 generic.go:334] "Generic (PLEG): container finished" podID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerID="a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c" exitCode=0
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.746683 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66bs5"
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.746709 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66bs5" event={"ID":"f143de69-7751-4e66-8e57-f1d715c7f14e","Type":"ContainerDied","Data":"a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c"}
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.746773 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66bs5" event={"ID":"f143de69-7751-4e66-8e57-f1d715c7f14e","Type":"ContainerDied","Data":"ea2fbae9a26ab63199c0a0c018d40223cc954088d5d15f29f733b8793596f6c6"}
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.746812 4991 scope.go:117] "RemoveContainer" containerID="a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c"
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.769083 4991 scope.go:117] "RemoveContainer" containerID="8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84"
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.782048 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-66bs5"]
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.786672 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-66bs5"]
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.811962 4991 scope.go:117] "RemoveContainer" containerID="86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0"
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.834412 4991 scope.go:117] "RemoveContainer" containerID="a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c"
Oct 06 08:33:01 crc kubenswrapper[4991]: E1006 08:33:01.834843 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c\": container with ID starting with a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c not found: ID does not exist" containerID="a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c"
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.834882 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c"} err="failed to get container status \"a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c\": rpc error: code = NotFound desc = could not find container \"a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c\": container with ID starting with a0c0b8ca539c5ee72fedcbd8d346281aaa4860c921d6904e79f87bf349ec591c not found: ID does not exist"
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.834908 4991 scope.go:117] "RemoveContainer" containerID="8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84"
Oct 06 08:33:01 crc kubenswrapper[4991]: E1006 08:33:01.835355 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84\": container with ID starting with 8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84 not found: ID does not exist" containerID="8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84"
Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.835376 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84"} err="failed to get container status \"8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84\": rpc error: code = NotFound desc = could not find container \"8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84\": container with ID 
starting with 8d6b02095abd4018fdd946203ced29bf2d78b44d2225cfb4de401f70e6541c84 not found: ID does not exist" Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.835389 4991 scope.go:117] "RemoveContainer" containerID="86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0" Oct 06 08:33:01 crc kubenswrapper[4991]: E1006 08:33:01.835857 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0\": container with ID starting with 86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0 not found: ID does not exist" containerID="86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0" Oct 06 08:33:01 crc kubenswrapper[4991]: I1006 08:33:01.835897 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0"} err="failed to get container status \"86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0\": rpc error: code = NotFound desc = could not find container \"86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0\": container with ID starting with 86fdf92ae7900c105ef312eabafbbee70391c0b5021f9e1f4f669b436d0696e0 not found: ID does not exist" Oct 06 08:33:03 crc kubenswrapper[4991]: I1006 08:33:03.256994 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f143de69-7751-4e66-8e57-f1d715c7f14e" path="/var/lib/kubelet/pods/f143de69-7751-4e66-8e57-f1d715c7f14e/volumes" Oct 06 08:33:04 crc kubenswrapper[4991]: I1006 08:33:04.914858 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hhw29"] Oct 06 08:33:04 crc kubenswrapper[4991]: E1006 08:33:04.915150 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerName="registry-server" Oct 06 08:33:04 crc 
kubenswrapper[4991]: I1006 08:33:04.915171 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerName="registry-server" Oct 06 08:33:04 crc kubenswrapper[4991]: E1006 08:33:04.915188 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerName="extract-content" Oct 06 08:33:04 crc kubenswrapper[4991]: I1006 08:33:04.915198 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerName="extract-content" Oct 06 08:33:04 crc kubenswrapper[4991]: E1006 08:33:04.915223 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerName="extract-utilities" Oct 06 08:33:04 crc kubenswrapper[4991]: I1006 08:33:04.915236 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerName="extract-utilities" Oct 06 08:33:04 crc kubenswrapper[4991]: I1006 08:33:04.915392 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f143de69-7751-4e66-8e57-f1d715c7f14e" containerName="registry-server" Oct 06 08:33:04 crc kubenswrapper[4991]: I1006 08:33:04.916519 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:04 crc kubenswrapper[4991]: I1006 08:33:04.929444 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhw29"] Oct 06 08:33:04 crc kubenswrapper[4991]: I1006 08:33:04.953993 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-catalog-content\") pod \"community-operators-hhw29\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:04 crc kubenswrapper[4991]: I1006 08:33:04.954118 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-utilities\") pod \"community-operators-hhw29\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:04 crc kubenswrapper[4991]: I1006 08:33:04.954173 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bnlz\" (UniqueName: \"kubernetes.io/projected/16c1c66a-36eb-4fad-ae43-515856947e7e-kube-api-access-2bnlz\") pod \"community-operators-hhw29\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.055041 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-utilities\") pod \"community-operators-hhw29\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.055117 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2bnlz\" (UniqueName: \"kubernetes.io/projected/16c1c66a-36eb-4fad-ae43-515856947e7e-kube-api-access-2bnlz\") pod \"community-operators-hhw29\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.055157 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-catalog-content\") pod \"community-operators-hhw29\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.055647 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-utilities\") pod \"community-operators-hhw29\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.055727 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-catalog-content\") pod \"community-operators-hhw29\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.061156 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665"] Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.062150 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.067791 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm"] Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.068649 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.070728 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.072289 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xnnzq" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.077738 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bnlz\" (UniqueName: \"kubernetes.io/projected/16c1c66a-36eb-4fad-ae43-515856947e7e-kube-api-access-2bnlz\") pod \"community-operators-hhw29\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.084084 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665"] Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.094252 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm"] Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.096400 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jzjv8"] Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.097312 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.200684 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w"] Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.201419 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.203346 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.203551 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-n4b26" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.203738 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.219614 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w"] Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.231430 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.258347 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7a7a3abc-344b-429e-a4eb-d62138e60de4-nmstate-lock\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.258577 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmcr\" (UniqueName: \"kubernetes.io/projected/7a7a3abc-344b-429e-a4eb-d62138e60de4-kube-api-access-swmcr\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.258606 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8kgw\" (UniqueName: \"kubernetes.io/projected/2c7406a6-af30-4f22-b109-9ea7e8cc2efe-kube-api-access-v8kgw\") pod \"nmstate-webhook-6cdbc54649-2nqhm\" (UID: \"2c7406a6-af30-4f22-b109-9ea7e8cc2efe\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.258675 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jbzs\" (UniqueName: \"kubernetes.io/projected/878e15d5-5337-4289-b425-82955b0b6b38-kube-api-access-8jbzs\") pod \"nmstate-metrics-fdff9cb8d-2l665\" (UID: \"878e15d5-5337-4289-b425-82955b0b6b38\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.258701 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/2c7406a6-af30-4f22-b109-9ea7e8cc2efe-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-2nqhm\" (UID: \"2c7406a6-af30-4f22-b109-9ea7e8cc2efe\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.258757 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7a7a3abc-344b-429e-a4eb-d62138e60de4-ovs-socket\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.258798 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7a7a3abc-344b-429e-a4eb-d62138e60de4-dbus-socket\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.361534 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7a7a3abc-344b-429e-a4eb-d62138e60de4-nmstate-lock\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.361581 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmcr\" (UniqueName: \"kubernetes.io/projected/7a7a3abc-344b-429e-a4eb-d62138e60de4-kube-api-access-swmcr\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.361610 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8kgw\" (UniqueName: 
\"kubernetes.io/projected/2c7406a6-af30-4f22-b109-9ea7e8cc2efe-kube-api-access-v8kgw\") pod \"nmstate-webhook-6cdbc54649-2nqhm\" (UID: \"2c7406a6-af30-4f22-b109-9ea7e8cc2efe\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.361639 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7z9g\" (UniqueName: \"kubernetes.io/projected/03389c9a-9320-4556-8ddb-77e061a1a6c8-kube-api-access-m7z9g\") pod \"nmstate-console-plugin-6b874cbd85-m6d2w\" (UID: \"03389c9a-9320-4556-8ddb-77e061a1a6c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.361694 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jbzs\" (UniqueName: \"kubernetes.io/projected/878e15d5-5337-4289-b425-82955b0b6b38-kube-api-access-8jbzs\") pod \"nmstate-metrics-fdff9cb8d-2l665\" (UID: \"878e15d5-5337-4289-b425-82955b0b6b38\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.361741 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2c7406a6-af30-4f22-b109-9ea7e8cc2efe-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-2nqhm\" (UID: \"2c7406a6-af30-4f22-b109-9ea7e8cc2efe\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.361764 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/03389c9a-9320-4556-8ddb-77e061a1a6c8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-m6d2w\" (UID: \"03389c9a-9320-4556-8ddb-77e061a1a6c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 
08:33:05.361795 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7a7a3abc-344b-429e-a4eb-d62138e60de4-ovs-socket\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.361820 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/03389c9a-9320-4556-8ddb-77e061a1a6c8-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-m6d2w\" (UID: \"03389c9a-9320-4556-8ddb-77e061a1a6c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.361850 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7a7a3abc-344b-429e-a4eb-d62138e60de4-dbus-socket\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.362158 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7a7a3abc-344b-429e-a4eb-d62138e60de4-nmstate-lock\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.362658 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7a7a3abc-344b-429e-a4eb-d62138e60de4-ovs-socket\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.362176 4991 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7a7a3abc-344b-429e-a4eb-d62138e60de4-dbus-socket\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.388543 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8kgw\" (UniqueName: \"kubernetes.io/projected/2c7406a6-af30-4f22-b109-9ea7e8cc2efe-kube-api-access-v8kgw\") pod \"nmstate-webhook-6cdbc54649-2nqhm\" (UID: \"2c7406a6-af30-4f22-b109-9ea7e8cc2efe\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.394186 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmcr\" (UniqueName: \"kubernetes.io/projected/7a7a3abc-344b-429e-a4eb-d62138e60de4-kube-api-access-swmcr\") pod \"nmstate-handler-jzjv8\" (UID: \"7a7a3abc-344b-429e-a4eb-d62138e60de4\") " pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.394666 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2c7406a6-af30-4f22-b109-9ea7e8cc2efe-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-2nqhm\" (UID: \"2c7406a6-af30-4f22-b109-9ea7e8cc2efe\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.401715 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jbzs\" (UniqueName: \"kubernetes.io/projected/878e15d5-5337-4289-b425-82955b0b6b38-kube-api-access-8jbzs\") pod \"nmstate-metrics-fdff9cb8d-2l665\" (UID: \"878e15d5-5337-4289-b425-82955b0b6b38\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.420608 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.435339 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.465880 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7z9g\" (UniqueName: \"kubernetes.io/projected/03389c9a-9320-4556-8ddb-77e061a1a6c8-kube-api-access-m7z9g\") pod \"nmstate-console-plugin-6b874cbd85-m6d2w\" (UID: \"03389c9a-9320-4556-8ddb-77e061a1a6c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.465936 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/03389c9a-9320-4556-8ddb-77e061a1a6c8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-m6d2w\" (UID: \"03389c9a-9320-4556-8ddb-77e061a1a6c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.465967 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/03389c9a-9320-4556-8ddb-77e061a1a6c8-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-m6d2w\" (UID: \"03389c9a-9320-4556-8ddb-77e061a1a6c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.466769 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/03389c9a-9320-4556-8ddb-77e061a1a6c8-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-m6d2w\" (UID: \"03389c9a-9320-4556-8ddb-77e061a1a6c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:05 crc kubenswrapper[4991]: E1006 
08:33:05.467033 4991 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 06 08:33:05 crc kubenswrapper[4991]: E1006 08:33:05.467068 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03389c9a-9320-4556-8ddb-77e061a1a6c8-plugin-serving-cert podName:03389c9a-9320-4556-8ddb-77e061a1a6c8 nodeName:}" failed. No retries permitted until 2025-10-06 08:33:05.967056053 +0000 UTC m=+837.704806074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/03389c9a-9320-4556-8ddb-77e061a1a6c8-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-m6d2w" (UID: "03389c9a-9320-4556-8ddb-77e061a1a6c8") : secret "plugin-serving-cert" not found Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.497261 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7z9g\" (UniqueName: \"kubernetes.io/projected/03389c9a-9320-4556-8ddb-77e061a1a6c8-kube-api-access-m7z9g\") pod \"nmstate-console-plugin-6b874cbd85-m6d2w\" (UID: \"03389c9a-9320-4556-8ddb-77e061a1a6c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.501850 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5488d6bb46-jlx2g"] Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.502502 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.506077 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5488d6bb46-jlx2g"] Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.566781 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-console-serving-cert\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.566837 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-service-ca\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.566857 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-oauth-serving-cert\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.566873 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-trusted-ca-bundle\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.566909 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-console-config\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.566933 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp5x4\" (UniqueName: \"kubernetes.io/projected/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-kube-api-access-mp5x4\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.566950 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-console-oauth-config\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.672251 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-console-oauth-config\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.672349 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-console-serving-cert\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 
08:33:05.672376 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-service-ca\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.672394 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-oauth-serving-cert\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.672411 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-trusted-ca-bundle\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.672443 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-console-config\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.672469 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp5x4\" (UniqueName: \"kubernetes.io/projected/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-kube-api-access-mp5x4\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.673851 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-oauth-serving-cert\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.674141 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-trusted-ca-bundle\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.674690 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-service-ca\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.675960 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-console-config\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.676446 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-console-oauth-config\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.679760 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-console-serving-cert\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.696960 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp5x4\" (UniqueName: \"kubernetes.io/projected/e20c83c6-725c-4aa0-bd20-ff3ac632dd8d-kube-api-access-mp5x4\") pod \"console-5488d6bb46-jlx2g\" (UID: \"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d\") " pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.701562 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.771945 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hhw29"] Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.772970 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jzjv8" event={"ID":"7a7a3abc-344b-429e-a4eb-d62138e60de4","Type":"ContainerStarted","Data":"5e365b7c94512022f0c5da21b4db0040e395a5a9e48b16db7703a3cfd7c50e5d"} Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.830990 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.975162 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/03389c9a-9320-4556-8ddb-77e061a1a6c8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-m6d2w\" (UID: \"03389c9a-9320-4556-8ddb-77e061a1a6c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:05 crc kubenswrapper[4991]: I1006 08:33:05.983328 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/03389c9a-9320-4556-8ddb-77e061a1a6c8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-m6d2w\" (UID: \"03389c9a-9320-4556-8ddb-77e061a1a6c8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.010409 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm"] Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.051997 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5488d6bb46-jlx2g"] Oct 06 08:33:06 crc kubenswrapper[4991]: W1006 08:33:06.057911 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode20c83c6_725c_4aa0_bd20_ff3ac632dd8d.slice/crio-0ef5b6f87629be032cac77c8f75b9b8afdee38cd9da6b3b135d2678ed9f2324a WatchSource:0}: Error finding container 0ef5b6f87629be032cac77c8f75b9b8afdee38cd9da6b3b135d2678ed9f2324a: Status 404 returned error can't find the container with id 0ef5b6f87629be032cac77c8f75b9b8afdee38cd9da6b3b135d2678ed9f2324a Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.117732 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.172601 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665"] Oct 06 08:33:06 crc kubenswrapper[4991]: W1006 08:33:06.193102 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod878e15d5_5337_4289_b425_82955b0b6b38.slice/crio-017c05d0fdfc03beba4b3e1473d8a39fdc9188ec3f605d187eecfc254ff16f32 WatchSource:0}: Error finding container 017c05d0fdfc03beba4b3e1473d8a39fdc9188ec3f605d187eecfc254ff16f32: Status 404 returned error can't find the container with id 017c05d0fdfc03beba4b3e1473d8a39fdc9188ec3f605d187eecfc254ff16f32 Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.518431 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w"] Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.782405 4991 generic.go:334] "Generic (PLEG): container finished" podID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerID="5c070188452024155de31959d6c6cc8f13535c624b57b40aba12f633e2718bdd" exitCode=0 Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.782469 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhw29" event={"ID":"16c1c66a-36eb-4fad-ae43-515856947e7e","Type":"ContainerDied","Data":"5c070188452024155de31959d6c6cc8f13535c624b57b40aba12f633e2718bdd"} Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.782499 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhw29" event={"ID":"16c1c66a-36eb-4fad-ae43-515856947e7e","Type":"ContainerStarted","Data":"05959e5544148fad0d119b25e73f9939462152980468fc5258108ee9451d7734"} Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.784855 4991 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" event={"ID":"03389c9a-9320-4556-8ddb-77e061a1a6c8","Type":"ContainerStarted","Data":"59cde80438441d5b229965220bbbe6d83e25dcb08623fe29ec26e0ec842470a6"} Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.787447 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5488d6bb46-jlx2g" event={"ID":"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d","Type":"ContainerStarted","Data":"c912a1e28f26be9b97f720823b69cb6c96a67a6db0dcbb9674f3f300c0bd1836"} Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.787519 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5488d6bb46-jlx2g" event={"ID":"e20c83c6-725c-4aa0-bd20-ff3ac632dd8d","Type":"ContainerStarted","Data":"0ef5b6f87629be032cac77c8f75b9b8afdee38cd9da6b3b135d2678ed9f2324a"} Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.789049 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" event={"ID":"2c7406a6-af30-4f22-b109-9ea7e8cc2efe","Type":"ContainerStarted","Data":"94ed600b70929e3ba523481b22aef6a33e406b5f5e78e81a92d3778782bb2f9b"} Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.790607 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665" event={"ID":"878e15d5-5337-4289-b425-82955b0b6b38","Type":"ContainerStarted","Data":"017c05d0fdfc03beba4b3e1473d8a39fdc9188ec3f605d187eecfc254ff16f32"} Oct 06 08:33:06 crc kubenswrapper[4991]: I1006 08:33:06.835594 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5488d6bb46-jlx2g" podStartSLOduration=1.8355731 podStartE2EDuration="1.8355731s" podCreationTimestamp="2025-10-06 08:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:33:06.832771264 +0000 UTC m=+838.570521285" 
watchObservedRunningTime="2025-10-06 08:33:06.8355731 +0000 UTC m=+838.573323131" Oct 06 08:33:07 crc kubenswrapper[4991]: I1006 08:33:07.800176 4991 generic.go:334] "Generic (PLEG): container finished" podID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerID="463f8a736ab5d9716dd66a4bb1fb19a4f25e9e93adbe58f7ff0c8cf8def1f2c8" exitCode=0 Oct 06 08:33:07 crc kubenswrapper[4991]: I1006 08:33:07.800274 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhw29" event={"ID":"16c1c66a-36eb-4fad-ae43-515856947e7e","Type":"ContainerDied","Data":"463f8a736ab5d9716dd66a4bb1fb19a4f25e9e93adbe58f7ff0c8cf8def1f2c8"} Oct 06 08:33:08 crc kubenswrapper[4991]: I1006 08:33:08.813690 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665" event={"ID":"878e15d5-5337-4289-b425-82955b0b6b38","Type":"ContainerStarted","Data":"b95ac65416dbb7414f5c879e48af2f5e57d2555438099d1639cd3fe34ac45317"} Oct 06 08:33:08 crc kubenswrapper[4991]: I1006 08:33:08.815936 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jzjv8" event={"ID":"7a7a3abc-344b-429e-a4eb-d62138e60de4","Type":"ContainerStarted","Data":"12d3511db95d8880bf108301a1f565ac882bfa8be4b6778eeac356b1feb439d6"} Oct 06 08:33:08 crc kubenswrapper[4991]: I1006 08:33:08.816027 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:08 crc kubenswrapper[4991]: I1006 08:33:08.817332 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" event={"ID":"03389c9a-9320-4556-8ddb-77e061a1a6c8","Type":"ContainerStarted","Data":"cb01c07e3c68762beee05f233493240313f56bdcf572b1ffaa9e859ffab4a3b9"} Oct 06 08:33:08 crc kubenswrapper[4991]: I1006 08:33:08.819808 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" 
event={"ID":"2c7406a6-af30-4f22-b109-9ea7e8cc2efe","Type":"ContainerStarted","Data":"b2a4048386b14b229737eb7755391606cd2064d44452563295900b7b4aa8e6b7"} Oct 06 08:33:08 crc kubenswrapper[4991]: I1006 08:33:08.820351 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" Oct 06 08:33:08 crc kubenswrapper[4991]: I1006 08:33:08.844971 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jzjv8" podStartSLOduration=0.84017142 podStartE2EDuration="3.844950021s" podCreationTimestamp="2025-10-06 08:33:05 +0000 UTC" firstStartedPulling="2025-10-06 08:33:05.502463181 +0000 UTC m=+837.240213202" lastFinishedPulling="2025-10-06 08:33:08.507241782 +0000 UTC m=+840.244991803" observedRunningTime="2025-10-06 08:33:08.836899391 +0000 UTC m=+840.574649402" watchObservedRunningTime="2025-10-06 08:33:08.844950021 +0000 UTC m=+840.582700042" Oct 06 08:33:08 crc kubenswrapper[4991]: I1006 08:33:08.853380 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-m6d2w" podStartSLOduration=1.874986889 podStartE2EDuration="3.853360151s" podCreationTimestamp="2025-10-06 08:33:05 +0000 UTC" firstStartedPulling="2025-10-06 08:33:06.529453716 +0000 UTC m=+838.267203747" lastFinishedPulling="2025-10-06 08:33:08.507826978 +0000 UTC m=+840.245577009" observedRunningTime="2025-10-06 08:33:08.849229008 +0000 UTC m=+840.586979029" watchObservedRunningTime="2025-10-06 08:33:08.853360151 +0000 UTC m=+840.591110162" Oct 06 08:33:08 crc kubenswrapper[4991]: I1006 08:33:08.868655 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" podStartSLOduration=1.388756188 podStartE2EDuration="3.868638979s" podCreationTimestamp="2025-10-06 08:33:05 +0000 UTC" firstStartedPulling="2025-10-06 08:33:06.035209796 +0000 UTC m=+837.772959827" 
lastFinishedPulling="2025-10-06 08:33:08.515092597 +0000 UTC m=+840.252842618" observedRunningTime="2025-10-06 08:33:08.865322588 +0000 UTC m=+840.603072619" watchObservedRunningTime="2025-10-06 08:33:08.868638979 +0000 UTC m=+840.606389000" Oct 06 08:33:09 crc kubenswrapper[4991]: I1006 08:33:09.828164 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhw29" event={"ID":"16c1c66a-36eb-4fad-ae43-515856947e7e","Type":"ContainerStarted","Data":"dfb1b64d81863de5009e2e7f4a38d6c16c104e8b16fb00312f6fb736400513e4"} Oct 06 08:33:10 crc kubenswrapper[4991]: I1006 08:33:10.834339 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665" event={"ID":"878e15d5-5337-4289-b425-82955b0b6b38","Type":"ContainerStarted","Data":"1e22df365ce27029b6c65c20a7aa9cf4dddb427c2b4e4f7aa1640671fb80b364"} Oct 06 08:33:10 crc kubenswrapper[4991]: I1006 08:33:10.849173 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hhw29" podStartSLOduration=4.860925878 podStartE2EDuration="6.849154329s" podCreationTimestamp="2025-10-06 08:33:04 +0000 UTC" firstStartedPulling="2025-10-06 08:33:06.786842908 +0000 UTC m=+838.524592929" lastFinishedPulling="2025-10-06 08:33:08.775071349 +0000 UTC m=+840.512821380" observedRunningTime="2025-10-06 08:33:09.855024683 +0000 UTC m=+841.592774704" watchObservedRunningTime="2025-10-06 08:33:10.849154329 +0000 UTC m=+842.586904350" Oct 06 08:33:10 crc kubenswrapper[4991]: I1006 08:33:10.853271 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-2l665" podStartSLOduration=1.5314430909999999 podStartE2EDuration="5.853256651s" podCreationTimestamp="2025-10-06 08:33:05 +0000 UTC" firstStartedPulling="2025-10-06 08:33:06.195202842 +0000 UTC m=+837.932952863" lastFinishedPulling="2025-10-06 08:33:10.517016402 +0000 UTC m=+842.254766423" 
observedRunningTime="2025-10-06 08:33:10.847956537 +0000 UTC m=+842.585706558" watchObservedRunningTime="2025-10-06 08:33:10.853256651 +0000 UTC m=+842.591006672" Oct 06 08:33:15 crc kubenswrapper[4991]: I1006 08:33:15.232219 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:15 crc kubenswrapper[4991]: I1006 08:33:15.233526 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:15 crc kubenswrapper[4991]: I1006 08:33:15.277871 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:15 crc kubenswrapper[4991]: I1006 08:33:15.462574 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jzjv8" Oct 06 08:33:15 crc kubenswrapper[4991]: I1006 08:33:15.832257 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:15 crc kubenswrapper[4991]: I1006 08:33:15.832389 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:15 crc kubenswrapper[4991]: I1006 08:33:15.839791 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:15 crc kubenswrapper[4991]: I1006 08:33:15.872421 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5488d6bb46-jlx2g" Oct 06 08:33:15 crc kubenswrapper[4991]: I1006 08:33:15.935198 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kp5gc"] Oct 06 08:33:15 crc kubenswrapper[4991]: I1006 08:33:15.947566 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:16 crc kubenswrapper[4991]: I1006 08:33:16.011588 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhw29"] Oct 06 08:33:17 crc kubenswrapper[4991]: I1006 08:33:17.877416 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hhw29" podUID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerName="registry-server" containerID="cri-o://dfb1b64d81863de5009e2e7f4a38d6c16c104e8b16fb00312f6fb736400513e4" gracePeriod=2 Oct 06 08:33:18 crc kubenswrapper[4991]: I1006 08:33:18.887680 4991 generic.go:334] "Generic (PLEG): container finished" podID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerID="dfb1b64d81863de5009e2e7f4a38d6c16c104e8b16fb00312f6fb736400513e4" exitCode=0 Oct 06 08:33:18 crc kubenswrapper[4991]: I1006 08:33:18.887789 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhw29" event={"ID":"16c1c66a-36eb-4fad-ae43-515856947e7e","Type":"ContainerDied","Data":"dfb1b64d81863de5009e2e7f4a38d6c16c104e8b16fb00312f6fb736400513e4"} Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.412748 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.613375 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bnlz\" (UniqueName: \"kubernetes.io/projected/16c1c66a-36eb-4fad-ae43-515856947e7e-kube-api-access-2bnlz\") pod \"16c1c66a-36eb-4fad-ae43-515856947e7e\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.613440 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-utilities\") pod \"16c1c66a-36eb-4fad-ae43-515856947e7e\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.613559 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-catalog-content\") pod \"16c1c66a-36eb-4fad-ae43-515856947e7e\" (UID: \"16c1c66a-36eb-4fad-ae43-515856947e7e\") " Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.614737 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-utilities" (OuterVolumeSpecName: "utilities") pod "16c1c66a-36eb-4fad-ae43-515856947e7e" (UID: "16c1c66a-36eb-4fad-ae43-515856947e7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.624524 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c1c66a-36eb-4fad-ae43-515856947e7e-kube-api-access-2bnlz" (OuterVolumeSpecName: "kube-api-access-2bnlz") pod "16c1c66a-36eb-4fad-ae43-515856947e7e" (UID: "16c1c66a-36eb-4fad-ae43-515856947e7e"). InnerVolumeSpecName "kube-api-access-2bnlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.680854 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16c1c66a-36eb-4fad-ae43-515856947e7e" (UID: "16c1c66a-36eb-4fad-ae43-515856947e7e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.715021 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bnlz\" (UniqueName: \"kubernetes.io/projected/16c1c66a-36eb-4fad-ae43-515856947e7e-kube-api-access-2bnlz\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.715072 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.715090 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c1c66a-36eb-4fad-ae43-515856947e7e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.894821 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hhw29" event={"ID":"16c1c66a-36eb-4fad-ae43-515856947e7e","Type":"ContainerDied","Data":"05959e5544148fad0d119b25e73f9939462152980468fc5258108ee9451d7734"} Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.894872 4991 scope.go:117] "RemoveContainer" containerID="dfb1b64d81863de5009e2e7f4a38d6c16c104e8b16fb00312f6fb736400513e4" Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.894938 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hhw29" Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.922198 4991 scope.go:117] "RemoveContainer" containerID="463f8a736ab5d9716dd66a4bb1fb19a4f25e9e93adbe58f7ff0c8cf8def1f2c8" Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.938808 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hhw29"] Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.945680 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hhw29"] Oct 06 08:33:19 crc kubenswrapper[4991]: I1006 08:33:19.959199 4991 scope.go:117] "RemoveContainer" containerID="5c070188452024155de31959d6c6cc8f13535c624b57b40aba12f633e2718bdd" Oct 06 08:33:20 crc kubenswrapper[4991]: E1006 08:33:20.032836 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c1c66a_36eb_4fad_ae43_515856947e7e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c1c66a_36eb_4fad_ae43_515856947e7e.slice/crio-05959e5544148fad0d119b25e73f9939462152980468fc5258108ee9451d7734\": RecentStats: unable to find data in memory cache]" Oct 06 08:33:21 crc kubenswrapper[4991]: I1006 08:33:21.253682 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16c1c66a-36eb-4fad-ae43-515856947e7e" path="/var/lib/kubelet/pods/16c1c66a-36eb-4fad-ae43-515856947e7e/volumes" Oct 06 08:33:25 crc kubenswrapper[4991]: I1006 08:33:25.427295 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-2nqhm" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.193438 4991 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct"] Oct 06 08:33:38 crc kubenswrapper[4991]: E1006 08:33:38.194240 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerName="registry-server" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.194257 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerName="registry-server" Oct 06 08:33:38 crc kubenswrapper[4991]: E1006 08:33:38.194286 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerName="extract-content" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.194316 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerName="extract-content" Oct 06 08:33:38 crc kubenswrapper[4991]: E1006 08:33:38.194328 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerName="extract-utilities" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.194335 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerName="extract-utilities" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.194454 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c1c66a-36eb-4fad-ae43-515856947e7e" containerName="registry-server" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.195256 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.197730 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.202283 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct"] Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.272232 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.272393 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t66t\" (UniqueName: \"kubernetes.io/projected/88652a70-4e73-407d-a897-e9a6613a7fc8-kube-api-access-7t66t\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.272522 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: 
I1006 08:33:38.373945 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.374050 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.374101 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t66t\" (UniqueName: \"kubernetes.io/projected/88652a70-4e73-407d-a897-e9a6613a7fc8-kube-api-access-7t66t\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.374500 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.374500 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.391661 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t66t\" (UniqueName: \"kubernetes.io/projected/88652a70-4e73-407d-a897-e9a6613a7fc8-kube-api-access-7t66t\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.512223 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:38 crc kubenswrapper[4991]: I1006 08:33:38.908033 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct"] Oct 06 08:33:39 crc kubenswrapper[4991]: I1006 08:33:39.014263 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" event={"ID":"88652a70-4e73-407d-a897-e9a6613a7fc8","Type":"ContainerStarted","Data":"62e8cef351ed04ea35884ba6ad67f814dc3c1e0d4534304cd4641b61906ba4ad"} Oct 06 08:33:40 crc kubenswrapper[4991]: I1006 08:33:40.022705 4991 generic.go:334] "Generic (PLEG): container finished" podID="88652a70-4e73-407d-a897-e9a6613a7fc8" containerID="7152d08123645486643f3a13b1ca449541f32615d8fe6fb79adb87e8e5d2ce3f" exitCode=0 Oct 06 08:33:40 crc kubenswrapper[4991]: I1006 08:33:40.022834 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" event={"ID":"88652a70-4e73-407d-a897-e9a6613a7fc8","Type":"ContainerDied","Data":"7152d08123645486643f3a13b1ca449541f32615d8fe6fb79adb87e8e5d2ce3f"} Oct 06 08:33:40 crc kubenswrapper[4991]: I1006 08:33:40.974767 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-kp5gc" podUID="c941e944-a837-41ff-90b0-29464fc3f02d" containerName="console" containerID="cri-o://26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1" gracePeriod=15 Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.148808 4991 patch_prober.go:28] interesting pod/console-f9d7485db-kp5gc container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.149415 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-kp5gc" podUID="c941e944-a837-41ff-90b0-29464fc3f02d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.386742 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kp5gc_c941e944-a837-41ff-90b0-29464fc3f02d/console/0.log" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.386811 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.414333 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-serving-cert\") pod \"c941e944-a837-41ff-90b0-29464fc3f02d\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.414414 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-console-config\") pod \"c941e944-a837-41ff-90b0-29464fc3f02d\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.414451 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-service-ca\") pod \"c941e944-a837-41ff-90b0-29464fc3f02d\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.414493 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4c8j\" (UniqueName: \"kubernetes.io/projected/c941e944-a837-41ff-90b0-29464fc3f02d-kube-api-access-c4c8j\") pod \"c941e944-a837-41ff-90b0-29464fc3f02d\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.414531 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-oauth-config\") pod \"c941e944-a837-41ff-90b0-29464fc3f02d\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.414570 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-oauth-serving-cert\") pod \"c941e944-a837-41ff-90b0-29464fc3f02d\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.414642 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-trusted-ca-bundle\") pod \"c941e944-a837-41ff-90b0-29464fc3f02d\" (UID: \"c941e944-a837-41ff-90b0-29464fc3f02d\") " Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.415636 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c941e944-a837-41ff-90b0-29464fc3f02d" (UID: "c941e944-a837-41ff-90b0-29464fc3f02d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.416249 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-console-config" (OuterVolumeSpecName: "console-config") pod "c941e944-a837-41ff-90b0-29464fc3f02d" (UID: "c941e944-a837-41ff-90b0-29464fc3f02d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.435980 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-service-ca" (OuterVolumeSpecName: "service-ca") pod "c941e944-a837-41ff-90b0-29464fc3f02d" (UID: "c941e944-a837-41ff-90b0-29464fc3f02d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.436756 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c941e944-a837-41ff-90b0-29464fc3f02d" (UID: "c941e944-a837-41ff-90b0-29464fc3f02d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.440491 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c941e944-a837-41ff-90b0-29464fc3f02d-kube-api-access-c4c8j" (OuterVolumeSpecName: "kube-api-access-c4c8j") pod "c941e944-a837-41ff-90b0-29464fc3f02d" (UID: "c941e944-a837-41ff-90b0-29464fc3f02d"). InnerVolumeSpecName "kube-api-access-c4c8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.442555 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c941e944-a837-41ff-90b0-29464fc3f02d" (UID: "c941e944-a837-41ff-90b0-29464fc3f02d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.442581 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c941e944-a837-41ff-90b0-29464fc3f02d" (UID: "c941e944-a837-41ff-90b0-29464fc3f02d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.515750 4991 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.515789 4991 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.515804 4991 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.515816 4991 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c941e944-a837-41ff-90b0-29464fc3f02d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.515828 4991 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.515839 4991 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c941e944-a837-41ff-90b0-29464fc3f02d-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:41 crc kubenswrapper[4991]: I1006 08:33:41.515851 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4c8j\" (UniqueName: \"kubernetes.io/projected/c941e944-a837-41ff-90b0-29464fc3f02d-kube-api-access-c4c8j\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:42 crc 
kubenswrapper[4991]: I1006 08:33:42.045340 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kp5gc_c941e944-a837-41ff-90b0-29464fc3f02d/console/0.log" Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.045719 4991 generic.go:334] "Generic (PLEG): container finished" podID="c941e944-a837-41ff-90b0-29464fc3f02d" containerID="26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1" exitCode=2 Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.045802 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kp5gc" Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.045822 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kp5gc" event={"ID":"c941e944-a837-41ff-90b0-29464fc3f02d","Type":"ContainerDied","Data":"26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1"} Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.045905 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kp5gc" event={"ID":"c941e944-a837-41ff-90b0-29464fc3f02d","Type":"ContainerDied","Data":"c7b4747145a0e8e248704bc7f148d93e5d5ffec6d93f46bdde0f648f083c1051"} Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.045936 4991 scope.go:117] "RemoveContainer" containerID="26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1" Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.049911 4991 generic.go:334] "Generic (PLEG): container finished" podID="88652a70-4e73-407d-a897-e9a6613a7fc8" containerID="59d3d9855af0d997f45ce4929efb7dca87f1dc510c43e558a48f0ab56379fab2" exitCode=0 Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.049956 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" 
event={"ID":"88652a70-4e73-407d-a897-e9a6613a7fc8","Type":"ContainerDied","Data":"59d3d9855af0d997f45ce4929efb7dca87f1dc510c43e558a48f0ab56379fab2"} Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.077843 4991 scope.go:117] "RemoveContainer" containerID="26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1" Oct 06 08:33:42 crc kubenswrapper[4991]: E1006 08:33:42.078332 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1\": container with ID starting with 26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1 not found: ID does not exist" containerID="26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1" Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.078366 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1"} err="failed to get container status \"26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1\": rpc error: code = NotFound desc = could not find container \"26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1\": container with ID starting with 26f0dcec1eb08730bc39cd18f266d74c07718f321e4730dc2d2b8c6969cbf2e1 not found: ID does not exist" Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.079896 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kp5gc"] Oct 06 08:33:42 crc kubenswrapper[4991]: I1006 08:33:42.082747 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-kp5gc"] Oct 06 08:33:43 crc kubenswrapper[4991]: I1006 08:33:43.058816 4991 generic.go:334] "Generic (PLEG): container finished" podID="88652a70-4e73-407d-a897-e9a6613a7fc8" containerID="00d989624a847626ff791bb2b944babc1d7fe3b70d4687d84c613ed4bb4bfcee" exitCode=0 Oct 06 08:33:43 crc 
kubenswrapper[4991]: I1006 08:33:43.058874 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" event={"ID":"88652a70-4e73-407d-a897-e9a6613a7fc8","Type":"ContainerDied","Data":"00d989624a847626ff791bb2b944babc1d7fe3b70d4687d84c613ed4bb4bfcee"} Oct 06 08:33:43 crc kubenswrapper[4991]: I1006 08:33:43.253019 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c941e944-a837-41ff-90b0-29464fc3f02d" path="/var/lib/kubelet/pods/c941e944-a837-41ff-90b0-29464fc3f02d/volumes" Oct 06 08:33:44 crc kubenswrapper[4991]: I1006 08:33:44.329227 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:44 crc kubenswrapper[4991]: I1006 08:33:44.351594 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-bundle\") pod \"88652a70-4e73-407d-a897-e9a6613a7fc8\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " Oct 06 08:33:44 crc kubenswrapper[4991]: I1006 08:33:44.351746 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-util\") pod \"88652a70-4e73-407d-a897-e9a6613a7fc8\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " Oct 06 08:33:44 crc kubenswrapper[4991]: I1006 08:33:44.351812 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t66t\" (UniqueName: \"kubernetes.io/projected/88652a70-4e73-407d-a897-e9a6613a7fc8-kube-api-access-7t66t\") pod \"88652a70-4e73-407d-a897-e9a6613a7fc8\" (UID: \"88652a70-4e73-407d-a897-e9a6613a7fc8\") " Oct 06 08:33:44 crc kubenswrapper[4991]: I1006 08:33:44.352450 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-bundle" (OuterVolumeSpecName: "bundle") pod "88652a70-4e73-407d-a897-e9a6613a7fc8" (UID: "88652a70-4e73-407d-a897-e9a6613a7fc8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:33:44 crc kubenswrapper[4991]: I1006 08:33:44.358853 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88652a70-4e73-407d-a897-e9a6613a7fc8-kube-api-access-7t66t" (OuterVolumeSpecName: "kube-api-access-7t66t") pod "88652a70-4e73-407d-a897-e9a6613a7fc8" (UID: "88652a70-4e73-407d-a897-e9a6613a7fc8"). InnerVolumeSpecName "kube-api-access-7t66t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:33:44 crc kubenswrapper[4991]: I1006 08:33:44.365686 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-util" (OuterVolumeSpecName: "util") pod "88652a70-4e73-407d-a897-e9a6613a7fc8" (UID: "88652a70-4e73-407d-a897-e9a6613a7fc8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:33:44 crc kubenswrapper[4991]: I1006 08:33:44.453742 4991 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:44 crc kubenswrapper[4991]: I1006 08:33:44.453779 4991 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88652a70-4e73-407d-a897-e9a6613a7fc8-util\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:44 crc kubenswrapper[4991]: I1006 08:33:44.453791 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t66t\" (UniqueName: \"kubernetes.io/projected/88652a70-4e73-407d-a897-e9a6613a7fc8-kube-api-access-7t66t\") on node \"crc\" DevicePath \"\"" Oct 06 08:33:45 crc kubenswrapper[4991]: I1006 08:33:45.080211 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" event={"ID":"88652a70-4e73-407d-a897-e9a6613a7fc8","Type":"ContainerDied","Data":"62e8cef351ed04ea35884ba6ad67f814dc3c1e0d4534304cd4641b61906ba4ad"} Oct 06 08:33:45 crc kubenswrapper[4991]: I1006 08:33:45.080634 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62e8cef351ed04ea35884ba6ad67f814dc3c1e0d4534304cd4641b61906ba4ad" Oct 06 08:33:45 crc kubenswrapper[4991]: I1006 08:33:45.080287 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.370233 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2"] Oct 06 08:33:53 crc kubenswrapper[4991]: E1006 08:33:53.371029 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c941e944-a837-41ff-90b0-29464fc3f02d" containerName="console" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.371044 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c941e944-a837-41ff-90b0-29464fc3f02d" containerName="console" Oct 06 08:33:53 crc kubenswrapper[4991]: E1006 08:33:53.371059 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88652a70-4e73-407d-a897-e9a6613a7fc8" containerName="util" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.371067 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="88652a70-4e73-407d-a897-e9a6613a7fc8" containerName="util" Oct 06 08:33:53 crc kubenswrapper[4991]: E1006 08:33:53.371078 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88652a70-4e73-407d-a897-e9a6613a7fc8" containerName="pull" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.371085 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="88652a70-4e73-407d-a897-e9a6613a7fc8" containerName="pull" Oct 06 08:33:53 crc kubenswrapper[4991]: E1006 08:33:53.371099 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88652a70-4e73-407d-a897-e9a6613a7fc8" containerName="extract" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.371106 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="88652a70-4e73-407d-a897-e9a6613a7fc8" containerName="extract" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.371226 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="88652a70-4e73-407d-a897-e9a6613a7fc8" 
containerName="extract" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.371242 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c941e944-a837-41ff-90b0-29464fc3f02d" containerName="console" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.371648 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.374880 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.375152 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.375521 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.375521 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.382377 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2"] Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.388165 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7jkk8" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.469670 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dblsf\" (UniqueName: \"kubernetes.io/projected/757d57fb-94e6-41dc-8266-38149c0e932a-kube-api-access-dblsf\") pod \"metallb-operator-controller-manager-647564c7bd-wrdk2\" (UID: \"757d57fb-94e6-41dc-8266-38149c0e932a\") " 
pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.469732 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/757d57fb-94e6-41dc-8266-38149c0e932a-webhook-cert\") pod \"metallb-operator-controller-manager-647564c7bd-wrdk2\" (UID: \"757d57fb-94e6-41dc-8266-38149c0e932a\") " pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.469770 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/757d57fb-94e6-41dc-8266-38149c0e932a-apiservice-cert\") pod \"metallb-operator-controller-manager-647564c7bd-wrdk2\" (UID: \"757d57fb-94e6-41dc-8266-38149c0e932a\") " pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.570638 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dblsf\" (UniqueName: \"kubernetes.io/projected/757d57fb-94e6-41dc-8266-38149c0e932a-kube-api-access-dblsf\") pod \"metallb-operator-controller-manager-647564c7bd-wrdk2\" (UID: \"757d57fb-94e6-41dc-8266-38149c0e932a\") " pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.570910 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/757d57fb-94e6-41dc-8266-38149c0e932a-webhook-cert\") pod \"metallb-operator-controller-manager-647564c7bd-wrdk2\" (UID: \"757d57fb-94e6-41dc-8266-38149c0e932a\") " pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.571022 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/757d57fb-94e6-41dc-8266-38149c0e932a-apiservice-cert\") pod \"metallb-operator-controller-manager-647564c7bd-wrdk2\" (UID: \"757d57fb-94e6-41dc-8266-38149c0e932a\") " pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.576340 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/757d57fb-94e6-41dc-8266-38149c0e932a-apiservice-cert\") pod \"metallb-operator-controller-manager-647564c7bd-wrdk2\" (UID: \"757d57fb-94e6-41dc-8266-38149c0e932a\") " pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.577032 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/757d57fb-94e6-41dc-8266-38149c0e932a-webhook-cert\") pod \"metallb-operator-controller-manager-647564c7bd-wrdk2\" (UID: \"757d57fb-94e6-41dc-8266-38149c0e932a\") " pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.592031 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dblsf\" (UniqueName: \"kubernetes.io/projected/757d57fb-94e6-41dc-8266-38149c0e932a-kube-api-access-dblsf\") pod \"metallb-operator-controller-manager-647564c7bd-wrdk2\" (UID: \"757d57fb-94e6-41dc-8266-38149c0e932a\") " pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.630122 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x"] Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.630812 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.636575 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.636638 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-t4mnt" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.638093 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.645553 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x"] Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.688783 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.773027 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00076896-8a73-4599-a610-ce6dca6e6495-apiservice-cert\") pod \"metallb-operator-webhook-server-97d9b8f56-6282x\" (UID: \"00076896-8a73-4599-a610-ce6dca6e6495\") " pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.773080 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrrvk\" (UniqueName: \"kubernetes.io/projected/00076896-8a73-4599-a610-ce6dca6e6495-kube-api-access-wrrvk\") pod \"metallb-operator-webhook-server-97d9b8f56-6282x\" (UID: \"00076896-8a73-4599-a610-ce6dca6e6495\") " pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 
08:33:53.773118 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00076896-8a73-4599-a610-ce6dca6e6495-webhook-cert\") pod \"metallb-operator-webhook-server-97d9b8f56-6282x\" (UID: \"00076896-8a73-4599-a610-ce6dca6e6495\") " pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.874116 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00076896-8a73-4599-a610-ce6dca6e6495-apiservice-cert\") pod \"metallb-operator-webhook-server-97d9b8f56-6282x\" (UID: \"00076896-8a73-4599-a610-ce6dca6e6495\") " pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.874517 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrrvk\" (UniqueName: \"kubernetes.io/projected/00076896-8a73-4599-a610-ce6dca6e6495-kube-api-access-wrrvk\") pod \"metallb-operator-webhook-server-97d9b8f56-6282x\" (UID: \"00076896-8a73-4599-a610-ce6dca6e6495\") " pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.874567 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00076896-8a73-4599-a610-ce6dca6e6495-webhook-cert\") pod \"metallb-operator-webhook-server-97d9b8f56-6282x\" (UID: \"00076896-8a73-4599-a610-ce6dca6e6495\") " pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.900968 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00076896-8a73-4599-a610-ce6dca6e6495-webhook-cert\") pod \"metallb-operator-webhook-server-97d9b8f56-6282x\" 
(UID: \"00076896-8a73-4599-a610-ce6dca6e6495\") " pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.907035 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrrvk\" (UniqueName: \"kubernetes.io/projected/00076896-8a73-4599-a610-ce6dca6e6495-kube-api-access-wrrvk\") pod \"metallb-operator-webhook-server-97d9b8f56-6282x\" (UID: \"00076896-8a73-4599-a610-ce6dca6e6495\") " pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.913649 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00076896-8a73-4599-a610-ce6dca6e6495-apiservice-cert\") pod \"metallb-operator-webhook-server-97d9b8f56-6282x\" (UID: \"00076896-8a73-4599-a610-ce6dca6e6495\") " pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:53 crc kubenswrapper[4991]: I1006 08:33:53.948811 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:54 crc kubenswrapper[4991]: I1006 08:33:54.186177 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2"] Oct 06 08:33:54 crc kubenswrapper[4991]: W1006 08:33:54.195686 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757d57fb_94e6_41dc_8266_38149c0e932a.slice/crio-a0fe9e5bf9a8477fc842fc3bb16a3740dca2360f7ee6089881bf4366d92f741e WatchSource:0}: Error finding container a0fe9e5bf9a8477fc842fc3bb16a3740dca2360f7ee6089881bf4366d92f741e: Status 404 returned error can't find the container with id a0fe9e5bf9a8477fc842fc3bb16a3740dca2360f7ee6089881bf4366d92f741e Oct 06 08:33:54 crc kubenswrapper[4991]: I1006 08:33:54.414214 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x"] Oct 06 08:33:54 crc kubenswrapper[4991]: W1006 08:33:54.421560 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00076896_8a73_4599_a610_ce6dca6e6495.slice/crio-252e211362ddc7b0a6df7219d2d1e63be31234b0e9059a5286cdab9c7034eac7 WatchSource:0}: Error finding container 252e211362ddc7b0a6df7219d2d1e63be31234b0e9059a5286cdab9c7034eac7: Status 404 returned error can't find the container with id 252e211362ddc7b0a6df7219d2d1e63be31234b0e9059a5286cdab9c7034eac7 Oct 06 08:33:55 crc kubenswrapper[4991]: I1006 08:33:55.140203 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" event={"ID":"757d57fb-94e6-41dc-8266-38149c0e932a","Type":"ContainerStarted","Data":"a0fe9e5bf9a8477fc842fc3bb16a3740dca2360f7ee6089881bf4366d92f741e"} Oct 06 08:33:55 crc kubenswrapper[4991]: I1006 08:33:55.141533 4991 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" event={"ID":"00076896-8a73-4599-a610-ce6dca6e6495","Type":"ContainerStarted","Data":"252e211362ddc7b0a6df7219d2d1e63be31234b0e9059a5286cdab9c7034eac7"} Oct 06 08:33:57 crc kubenswrapper[4991]: I1006 08:33:57.154279 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" event={"ID":"757d57fb-94e6-41dc-8266-38149c0e932a","Type":"ContainerStarted","Data":"69758058925b584f65bf5b794d1ea2668d3ffe04b34c152f208e945068ea7021"} Oct 06 08:33:57 crc kubenswrapper[4991]: I1006 08:33:57.154610 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:33:57 crc kubenswrapper[4991]: I1006 08:33:57.179066 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" podStartSLOduration=1.660264867 podStartE2EDuration="4.179048376s" podCreationTimestamp="2025-10-06 08:33:53 +0000 UTC" firstStartedPulling="2025-10-06 08:33:54.197400256 +0000 UTC m=+885.935150277" lastFinishedPulling="2025-10-06 08:33:56.716183755 +0000 UTC m=+888.453933786" observedRunningTime="2025-10-06 08:33:57.16967192 +0000 UTC m=+888.907421971" watchObservedRunningTime="2025-10-06 08:33:57.179048376 +0000 UTC m=+888.916798397" Oct 06 08:33:57 crc kubenswrapper[4991]: I1006 08:33:57.529205 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:33:57 crc kubenswrapper[4991]: I1006 08:33:57.529267 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" 
podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:33:59 crc kubenswrapper[4991]: I1006 08:33:59.165668 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" event={"ID":"00076896-8a73-4599-a610-ce6dca6e6495","Type":"ContainerStarted","Data":"2b21f01477a1fe47a589b8b90a15e57dedbac504e3393e128e464c4c9e76b1c7"} Oct 06 08:33:59 crc kubenswrapper[4991]: I1006 08:33:59.165808 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:33:59 crc kubenswrapper[4991]: I1006 08:33:59.183655 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" podStartSLOduration=2.194958711 podStartE2EDuration="6.183636863s" podCreationTimestamp="2025-10-06 08:33:53 +0000 UTC" firstStartedPulling="2025-10-06 08:33:54.426492003 +0000 UTC m=+886.164242024" lastFinishedPulling="2025-10-06 08:33:58.415170155 +0000 UTC m=+890.152920176" observedRunningTime="2025-10-06 08:33:59.179594441 +0000 UTC m=+890.917344462" watchObservedRunningTime="2025-10-06 08:33:59.183636863 +0000 UTC m=+890.921386884" Oct 06 08:34:13 crc kubenswrapper[4991]: I1006 08:34:13.955444 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-97d9b8f56-6282x" Oct 06 08:34:27 crc kubenswrapper[4991]: I1006 08:34:27.529001 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:34:27 crc kubenswrapper[4991]: I1006 
08:34:27.529740 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:34:33 crc kubenswrapper[4991]: I1006 08:34:33.691915 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-647564c7bd-wrdk2" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.583081 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-gb555"] Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.587844 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.590135 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.590607 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.590646 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-72kxg" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.594015 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls"] Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.594795 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.606829 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.625685 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls"] Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.670548 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zmsbl"] Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.671515 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.673658 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.673700 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.673699 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4gbwt" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.675257 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.679107 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-gh7zn"] Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.680147 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.681588 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.691592 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-gh7zn"] Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.709791 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1c2c84a-e1a5-4bad-b383-83790b446262-metrics-certs\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.709858 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trxdq\" (UniqueName: \"kubernetes.io/projected/a134e93b-350e-4bfe-9f9f-b743a7b256c6-kube-api-access-trxdq\") pod \"frr-k8s-webhook-server-64bf5d555-4w7ls\" (UID: \"a134e93b-350e-4bfe-9f9f-b743a7b256c6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.709887 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-metrics\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.709917 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cspxv\" (UniqueName: \"kubernetes.io/projected/a1c2c84a-e1a5-4bad-b383-83790b446262-kube-api-access-cspxv\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " 
pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.709952 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a134e93b-350e-4bfe-9f9f-b743a7b256c6-cert\") pod \"frr-k8s-webhook-server-64bf5d555-4w7ls\" (UID: \"a134e93b-350e-4bfe-9f9f-b743a7b256c6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.710015 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-frr-conf\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.710034 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-reloader\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.710052 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a1c2c84a-e1a5-4bad-b383-83790b446262-frr-startup\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.710082 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-frr-sockets\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 
08:34:34.811590 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a134e93b-350e-4bfe-9f9f-b743a7b256c6-cert\") pod \"frr-k8s-webhook-server-64bf5d555-4w7ls\" (UID: \"a134e93b-350e-4bfe-9f9f-b743a7b256c6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.811629 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-metrics-certs\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.811663 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-metallb-excludel2\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.811683 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7chr\" (UniqueName: \"kubernetes.io/projected/02de489e-94ff-4cb3-b3c2-7c45f6b64f33-kube-api-access-w7chr\") pod \"controller-68d546b9d8-gh7zn\" (UID: \"02de489e-94ff-4cb3-b3c2-7c45f6b64f33\") " pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.811889 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-reloader\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.811926 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-frr-conf\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.811960 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a1c2c84a-e1a5-4bad-b383-83790b446262-frr-startup\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.811999 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-frr-sockets\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812057 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-memberlist\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812101 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1c2c84a-e1a5-4bad-b383-83790b446262-metrics-certs\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812165 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trxdq\" (UniqueName: \"kubernetes.io/projected/a134e93b-350e-4bfe-9f9f-b743a7b256c6-kube-api-access-trxdq\") pod 
\"frr-k8s-webhook-server-64bf5d555-4w7ls\" (UID: \"a134e93b-350e-4bfe-9f9f-b743a7b256c6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812200 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-metrics\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812221 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02de489e-94ff-4cb3-b3c2-7c45f6b64f33-cert\") pod \"controller-68d546b9d8-gh7zn\" (UID: \"02de489e-94ff-4cb3-b3c2-7c45f6b64f33\") " pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812586 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de489e-94ff-4cb3-b3c2-7c45f6b64f33-metrics-certs\") pod \"controller-68d546b9d8-gh7zn\" (UID: \"02de489e-94ff-4cb3-b3c2-7c45f6b64f33\") " pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812633 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cspxv\" (UniqueName: \"kubernetes.io/projected/a1c2c84a-e1a5-4bad-b383-83790b446262-kube-api-access-cspxv\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812728 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-metrics\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " 
pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812773 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-frr-sockets\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812671 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqq8d\" (UniqueName: \"kubernetes.io/projected/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-kube-api-access-qqq8d\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.812949 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-reloader\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.813062 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a1c2c84a-e1a5-4bad-b383-83790b446262-frr-conf\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.813450 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a1c2c84a-e1a5-4bad-b383-83790b446262-frr-startup\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.819903 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a1c2c84a-e1a5-4bad-b383-83790b446262-metrics-certs\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.826506 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a134e93b-350e-4bfe-9f9f-b743a7b256c6-cert\") pod \"frr-k8s-webhook-server-64bf5d555-4w7ls\" (UID: \"a134e93b-350e-4bfe-9f9f-b743a7b256c6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.829334 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cspxv\" (UniqueName: \"kubernetes.io/projected/a1c2c84a-e1a5-4bad-b383-83790b446262-kube-api-access-cspxv\") pod \"frr-k8s-gb555\" (UID: \"a1c2c84a-e1a5-4bad-b383-83790b446262\") " pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.829774 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trxdq\" (UniqueName: \"kubernetes.io/projected/a134e93b-350e-4bfe-9f9f-b743a7b256c6-kube-api-access-trxdq\") pod \"frr-k8s-webhook-server-64bf5d555-4w7ls\" (UID: \"a134e93b-350e-4bfe-9f9f-b743a7b256c6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.910861 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.914253 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-memberlist\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.914326 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02de489e-94ff-4cb3-b3c2-7c45f6b64f33-cert\") pod \"controller-68d546b9d8-gh7zn\" (UID: \"02de489e-94ff-4cb3-b3c2-7c45f6b64f33\") " pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.914347 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de489e-94ff-4cb3-b3c2-7c45f6b64f33-metrics-certs\") pod \"controller-68d546b9d8-gh7zn\" (UID: \"02de489e-94ff-4cb3-b3c2-7c45f6b64f33\") " pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.914376 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqq8d\" (UniqueName: \"kubernetes.io/projected/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-kube-api-access-qqq8d\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.914395 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-metrics-certs\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.914427 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-metallb-excludel2\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.914443 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7chr\" (UniqueName: \"kubernetes.io/projected/02de489e-94ff-4cb3-b3c2-7c45f6b64f33-kube-api-access-w7chr\") pod \"controller-68d546b9d8-gh7zn\" (UID: \"02de489e-94ff-4cb3-b3c2-7c45f6b64f33\") " pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:34 crc kubenswrapper[4991]: E1006 08:34:34.914398 4991 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 08:34:34 crc kubenswrapper[4991]: E1006 08:34:34.914631 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-memberlist podName:3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb nodeName:}" failed. No retries permitted until 2025-10-06 08:34:35.414612827 +0000 UTC m=+927.152362838 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-memberlist") pod "speaker-zmsbl" (UID: "3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb") : secret "metallb-memberlist" not found Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.915272 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-metallb-excludel2\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.917124 4991 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.917733 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-metrics-certs\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.917914 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.919370 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02de489e-94ff-4cb3-b3c2-7c45f6b64f33-metrics-certs\") pod \"controller-68d546b9d8-gh7zn\" (UID: \"02de489e-94ff-4cb3-b3c2-7c45f6b64f33\") " pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.928253 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02de489e-94ff-4cb3-b3c2-7c45f6b64f33-cert\") pod \"controller-68d546b9d8-gh7zn\" (UID: \"02de489e-94ff-4cb3-b3c2-7c45f6b64f33\") " pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.932486 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7chr\" (UniqueName: \"kubernetes.io/projected/02de489e-94ff-4cb3-b3c2-7c45f6b64f33-kube-api-access-w7chr\") pod \"controller-68d546b9d8-gh7zn\" (UID: \"02de489e-94ff-4cb3-b3c2-7c45f6b64f33\") " pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.937854 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqq8d\" (UniqueName: \"kubernetes.io/projected/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-kube-api-access-qqq8d\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:34 crc kubenswrapper[4991]: I1006 08:34:34.998331 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:35 crc kubenswrapper[4991]: I1006 08:34:35.119483 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls"] Oct 06 08:34:35 crc kubenswrapper[4991]: W1006 08:34:35.120435 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda134e93b_350e_4bfe_9f9f_b743a7b256c6.slice/crio-ae2e658ca118a08e7b7aefe42e2c6fba327ca97a54b15cb9066ab48a817f1867 WatchSource:0}: Error finding container ae2e658ca118a08e7b7aefe42e2c6fba327ca97a54b15cb9066ab48a817f1867: Status 404 returned error can't find the container with id ae2e658ca118a08e7b7aefe42e2c6fba327ca97a54b15cb9066ab48a817f1867 Oct 06 08:34:35 crc kubenswrapper[4991]: I1006 08:34:35.374976 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" event={"ID":"a134e93b-350e-4bfe-9f9f-b743a7b256c6","Type":"ContainerStarted","Data":"ae2e658ca118a08e7b7aefe42e2c6fba327ca97a54b15cb9066ab48a817f1867"} Oct 06 08:34:35 crc kubenswrapper[4991]: I1006 08:34:35.375952 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gb555" event={"ID":"a1c2c84a-e1a5-4bad-b383-83790b446262","Type":"ContainerStarted","Data":"89e3de64f3de432f398c1a4fab6a6a16c28538ae943595b4c4f0cc3b469bc4f5"} Oct 06 08:34:35 crc kubenswrapper[4991]: I1006 08:34:35.416844 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-gh7zn"] Oct 06 08:34:35 crc kubenswrapper[4991]: I1006 08:34:35.420511 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-memberlist\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:35 crc kubenswrapper[4991]: E1006 
08:34:35.420735 4991 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 08:34:35 crc kubenswrapper[4991]: E1006 08:34:35.420823 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-memberlist podName:3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb nodeName:}" failed. No retries permitted until 2025-10-06 08:34:36.420797732 +0000 UTC m=+928.158547773 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-memberlist") pod "speaker-zmsbl" (UID: "3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb") : secret "metallb-memberlist" not found Oct 06 08:34:35 crc kubenswrapper[4991]: W1006 08:34:35.421503 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02de489e_94ff_4cb3_b3c2_7c45f6b64f33.slice/crio-8ffe89c03c28e9a974cddd1a748f460a113249767bebefa088f2042d2950488b WatchSource:0}: Error finding container 8ffe89c03c28e9a974cddd1a748f460a113249767bebefa088f2042d2950488b: Status 404 returned error can't find the container with id 8ffe89c03c28e9a974cddd1a748f460a113249767bebefa088f2042d2950488b Oct 06 08:34:36 crc kubenswrapper[4991]: I1006 08:34:36.382598 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-gh7zn" event={"ID":"02de489e-94ff-4cb3-b3c2-7c45f6b64f33","Type":"ContainerStarted","Data":"95024b313caae8d55efbaaae2b88890d453766acfe6ce200673b5b6c1692081d"} Oct 06 08:34:36 crc kubenswrapper[4991]: I1006 08:34:36.382954 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:36 crc kubenswrapper[4991]: I1006 08:34:36.382969 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-gh7zn" 
event={"ID":"02de489e-94ff-4cb3-b3c2-7c45f6b64f33","Type":"ContainerStarted","Data":"b71acab9580b4c9358370ba6507b5f40b23eb41a7bc1c346a50786e1c1e0d62b"} Oct 06 08:34:36 crc kubenswrapper[4991]: I1006 08:34:36.382980 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-gh7zn" event={"ID":"02de489e-94ff-4cb3-b3c2-7c45f6b64f33","Type":"ContainerStarted","Data":"8ffe89c03c28e9a974cddd1a748f460a113249767bebefa088f2042d2950488b"} Oct 06 08:34:36 crc kubenswrapper[4991]: I1006 08:34:36.406787 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-gh7zn" podStartSLOduration=2.406766549 podStartE2EDuration="2.406766549s" podCreationTimestamp="2025-10-06 08:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:34:36.401809733 +0000 UTC m=+928.139559744" watchObservedRunningTime="2025-10-06 08:34:36.406766549 +0000 UTC m=+928.144516570" Oct 06 08:34:36 crc kubenswrapper[4991]: I1006 08:34:36.434886 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-memberlist\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:36 crc kubenswrapper[4991]: I1006 08:34:36.446499 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb-memberlist\") pod \"speaker-zmsbl\" (UID: \"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb\") " pod="metallb-system/speaker-zmsbl" Oct 06 08:34:36 crc kubenswrapper[4991]: I1006 08:34:36.492217 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-zmsbl" Oct 06 08:34:36 crc kubenswrapper[4991]: W1006 08:34:36.514660 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cd0c363_f3b8_4c9e_9a1c_c2e7902133bb.slice/crio-3d5e689389b7c25ba07420d77ebed84e7c64462a7c35be405c18f5df46d6ed0e WatchSource:0}: Error finding container 3d5e689389b7c25ba07420d77ebed84e7c64462a7c35be405c18f5df46d6ed0e: Status 404 returned error can't find the container with id 3d5e689389b7c25ba07420d77ebed84e7c64462a7c35be405c18f5df46d6ed0e Oct 06 08:34:37 crc kubenswrapper[4991]: I1006 08:34:37.394628 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zmsbl" event={"ID":"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb","Type":"ContainerStarted","Data":"01a06333d98fdd99eda595ec4437745cbd5f5fdd71094ad89d09b8aa0e0955bf"} Oct 06 08:34:37 crc kubenswrapper[4991]: I1006 08:34:37.394976 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zmsbl" event={"ID":"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb","Type":"ContainerStarted","Data":"a5d0f565233c35906c9acea7657703533676f6749ba44dc6b4f1d4d649a53d83"} Oct 06 08:34:37 crc kubenswrapper[4991]: I1006 08:34:37.394986 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zmsbl" event={"ID":"3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb","Type":"ContainerStarted","Data":"3d5e689389b7c25ba07420d77ebed84e7c64462a7c35be405c18f5df46d6ed0e"} Oct 06 08:34:37 crc kubenswrapper[4991]: I1006 08:34:37.395547 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zmsbl" Oct 06 08:34:37 crc kubenswrapper[4991]: I1006 08:34:37.408907 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zmsbl" podStartSLOduration=3.408892027 podStartE2EDuration="3.408892027s" podCreationTimestamp="2025-10-06 08:34:34 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:34:37.406880722 +0000 UTC m=+929.144630743" watchObservedRunningTime="2025-10-06 08:34:37.408892027 +0000 UTC m=+929.146642048" Oct 06 08:34:42 crc kubenswrapper[4991]: I1006 08:34:42.426510 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" event={"ID":"a134e93b-350e-4bfe-9f9f-b743a7b256c6","Type":"ContainerStarted","Data":"78fe8d46eb903be0b2a16168e6e78c25725a566d0ea2e6ab51b187403def83fa"} Oct 06 08:34:42 crc kubenswrapper[4991]: I1006 08:34:42.427234 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" Oct 06 08:34:42 crc kubenswrapper[4991]: I1006 08:34:42.429891 4991 generic.go:334] "Generic (PLEG): container finished" podID="a1c2c84a-e1a5-4bad-b383-83790b446262" containerID="ead1d8f5bb8b33bd88e69368180d3a58bbedef1a50bde003773905c5a0b903e4" exitCode=0 Oct 06 08:34:42 crc kubenswrapper[4991]: I1006 08:34:42.429952 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gb555" event={"ID":"a1c2c84a-e1a5-4bad-b383-83790b446262","Type":"ContainerDied","Data":"ead1d8f5bb8b33bd88e69368180d3a58bbedef1a50bde003773905c5a0b903e4"} Oct 06 08:34:42 crc kubenswrapper[4991]: I1006 08:34:42.458718 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" podStartSLOduration=1.974567535 podStartE2EDuration="8.458684518s" podCreationTimestamp="2025-10-06 08:34:34 +0000 UTC" firstStartedPulling="2025-10-06 08:34:35.122787463 +0000 UTC m=+926.860537504" lastFinishedPulling="2025-10-06 08:34:41.606904466 +0000 UTC m=+933.344654487" observedRunningTime="2025-10-06 08:34:42.447649488 +0000 UTC m=+934.185399519" watchObservedRunningTime="2025-10-06 08:34:42.458684518 +0000 UTC m=+934.196434589" Oct 06 08:34:43 
crc kubenswrapper[4991]: I1006 08:34:43.441600 4991 generic.go:334] "Generic (PLEG): container finished" podID="a1c2c84a-e1a5-4bad-b383-83790b446262" containerID="98a3d8ee4c6f422345669b588b823aea4ddb217b760ddcfbcf5f651aec7dbf3f" exitCode=0 Oct 06 08:34:43 crc kubenswrapper[4991]: I1006 08:34:43.442439 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gb555" event={"ID":"a1c2c84a-e1a5-4bad-b383-83790b446262","Type":"ContainerDied","Data":"98a3d8ee4c6f422345669b588b823aea4ddb217b760ddcfbcf5f651aec7dbf3f"} Oct 06 08:34:44 crc kubenswrapper[4991]: I1006 08:34:44.452918 4991 generic.go:334] "Generic (PLEG): container finished" podID="a1c2c84a-e1a5-4bad-b383-83790b446262" containerID="f219269fc80e231212f2d7797bc34827000ea4105f133a0435255d3512127ff0" exitCode=0 Oct 06 08:34:44 crc kubenswrapper[4991]: I1006 08:34:44.452993 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gb555" event={"ID":"a1c2c84a-e1a5-4bad-b383-83790b446262","Type":"ContainerDied","Data":"f219269fc80e231212f2d7797bc34827000ea4105f133a0435255d3512127ff0"} Oct 06 08:34:45 crc kubenswrapper[4991]: I1006 08:34:45.464255 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gb555" event={"ID":"a1c2c84a-e1a5-4bad-b383-83790b446262","Type":"ContainerStarted","Data":"24b058da049286e0ac8f902dc94b226d1782258e4adaab611f985dafe7bb21a7"} Oct 06 08:34:45 crc kubenswrapper[4991]: I1006 08:34:45.464615 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gb555" event={"ID":"a1c2c84a-e1a5-4bad-b383-83790b446262","Type":"ContainerStarted","Data":"54c6b76937e06025ff76037c5e327450c479274c0bf25827f7b2af4443c9e0f6"} Oct 06 08:34:45 crc kubenswrapper[4991]: I1006 08:34:45.464627 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gb555" 
event={"ID":"a1c2c84a-e1a5-4bad-b383-83790b446262","Type":"ContainerStarted","Data":"b0a795fc4d88da53455173818f648e5fd76e674822ce1b3d9e4086d2224081cd"} Oct 06 08:34:45 crc kubenswrapper[4991]: I1006 08:34:45.464637 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gb555" event={"ID":"a1c2c84a-e1a5-4bad-b383-83790b446262","Type":"ContainerStarted","Data":"cb6d82e1332b7993a7155260019c56400d663403d7047436f5b77d9233b82ef3"} Oct 06 08:34:45 crc kubenswrapper[4991]: I1006 08:34:45.464646 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gb555" event={"ID":"a1c2c84a-e1a5-4bad-b383-83790b446262","Type":"ContainerStarted","Data":"31821eceed8154fa9aed1bed0a718f8d3177fc04dcf52df3b754c8144167035b"} Oct 06 08:34:46 crc kubenswrapper[4991]: I1006 08:34:46.475668 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-gb555" event={"ID":"a1c2c84a-e1a5-4bad-b383-83790b446262","Type":"ContainerStarted","Data":"5a664f64db807a2ff76168b5fdcd57d94ca2de78db7ed0b2924e05912747aa18"} Oct 06 08:34:46 crc kubenswrapper[4991]: I1006 08:34:46.476103 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:46 crc kubenswrapper[4991]: I1006 08:34:46.495232 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zmsbl" Oct 06 08:34:46 crc kubenswrapper[4991]: I1006 08:34:46.503978 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-gb555" podStartSLOduration=5.984048414 podStartE2EDuration="12.503959576s" podCreationTimestamp="2025-10-06 08:34:34 +0000 UTC" firstStartedPulling="2025-10-06 08:34:35.070501475 +0000 UTC m=+926.808251496" lastFinishedPulling="2025-10-06 08:34:41.590412617 +0000 UTC m=+933.328162658" observedRunningTime="2025-10-06 08:34:46.498615991 +0000 UTC m=+938.236366012" watchObservedRunningTime="2025-10-06 08:34:46.503959576 +0000 
UTC m=+938.241709607" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.244422 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk"] Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.246654 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.249013 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.250640 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk"] Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.414310 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.414430 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.414526 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4nzk\" (UniqueName: 
\"kubernetes.io/projected/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-kube-api-access-v4nzk\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.516093 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.516152 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.516189 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4nzk\" (UniqueName: \"kubernetes.io/projected/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-kube-api-access-v4nzk\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.516672 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk\" 
(UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.516730 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.556387 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4nzk\" (UniqueName: \"kubernetes.io/projected/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-kube-api-access-v4nzk\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.578939 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:48 crc kubenswrapper[4991]: I1006 08:34:48.772794 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk"] Oct 06 08:34:49 crc kubenswrapper[4991]: I1006 08:34:49.495059 4991 generic.go:334] "Generic (PLEG): container finished" podID="8e1b7145-64f5-4381-86f7-23a8b1bb16ec" containerID="e177a51b0c5e7bb7bc6c41bc68ec281a4ef73d18f9636bea353ee11b0804ebd3" exitCode=0 Oct 06 08:34:49 crc kubenswrapper[4991]: I1006 08:34:49.495131 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" event={"ID":"8e1b7145-64f5-4381-86f7-23a8b1bb16ec","Type":"ContainerDied","Data":"e177a51b0c5e7bb7bc6c41bc68ec281a4ef73d18f9636bea353ee11b0804ebd3"} Oct 06 08:34:49 crc kubenswrapper[4991]: I1006 08:34:49.495459 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" event={"ID":"8e1b7145-64f5-4381-86f7-23a8b1bb16ec","Type":"ContainerStarted","Data":"ed145f578a184ca7a19b6be2b3df3c1aabd802173a5a9c9c39f6c09c6a0ec6ba"} Oct 06 08:34:49 crc kubenswrapper[4991]: I1006 08:34:49.911139 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:49 crc kubenswrapper[4991]: I1006 08:34:49.950748 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:54 crc kubenswrapper[4991]: I1006 08:34:54.915328 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-gb555" Oct 06 08:34:54 crc kubenswrapper[4991]: I1006 08:34:54.926933 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-4w7ls" Oct 06 08:34:55 crc kubenswrapper[4991]: I1006 08:34:55.002623 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-gh7zn" Oct 06 08:34:57 crc kubenswrapper[4991]: I1006 08:34:57.529768 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:34:57 crc kubenswrapper[4991]: I1006 08:34:57.530131 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:34:57 crc kubenswrapper[4991]: I1006 08:34:57.530183 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:34:57 crc kubenswrapper[4991]: I1006 08:34:57.530998 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee6239739727eb6d7bb018a70f54ea31ce396adbac7977b5d2326c033722faa0"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:34:57 crc kubenswrapper[4991]: I1006 08:34:57.531074 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" 
containerID="cri-o://ee6239739727eb6d7bb018a70f54ea31ce396adbac7977b5d2326c033722faa0" gracePeriod=600 Oct 06 08:34:57 crc kubenswrapper[4991]: I1006 08:34:57.543805 4991 generic.go:334] "Generic (PLEG): container finished" podID="8e1b7145-64f5-4381-86f7-23a8b1bb16ec" containerID="677ea8c34deeac7e792a058e96b427a064f7ad58e8784b879a11a348ee5f249e" exitCode=0 Oct 06 08:34:57 crc kubenswrapper[4991]: I1006 08:34:57.543885 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" event={"ID":"8e1b7145-64f5-4381-86f7-23a8b1bb16ec","Type":"ContainerDied","Data":"677ea8c34deeac7e792a058e96b427a064f7ad58e8784b879a11a348ee5f249e"} Oct 06 08:34:58 crc kubenswrapper[4991]: I1006 08:34:58.554341 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="ee6239739727eb6d7bb018a70f54ea31ce396adbac7977b5d2326c033722faa0" exitCode=0 Oct 06 08:34:58 crc kubenswrapper[4991]: I1006 08:34:58.554417 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"ee6239739727eb6d7bb018a70f54ea31ce396adbac7977b5d2326c033722faa0"} Oct 06 08:34:58 crc kubenswrapper[4991]: I1006 08:34:58.555131 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"e1369062046a805994e1e0d5f87b5a6e887447735123010879df4c4305faa2ba"} Oct 06 08:34:58 crc kubenswrapper[4991]: I1006 08:34:58.555165 4991 scope.go:117] "RemoveContainer" containerID="3169b67ddb39c04fb8f22ea7b9f7ae55cd068df65648f9ad55f3275e8f92dd3b" Oct 06 08:34:58 crc kubenswrapper[4991]: I1006 08:34:58.559380 4991 generic.go:334] "Generic (PLEG): container finished" podID="8e1b7145-64f5-4381-86f7-23a8b1bb16ec" 
containerID="e8df7da23d89011ade0720dd16c6f87299f8eab5ed24dc8dbc6674e6c981d6a4" exitCode=0 Oct 06 08:34:58 crc kubenswrapper[4991]: I1006 08:34:58.559498 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" event={"ID":"8e1b7145-64f5-4381-86f7-23a8b1bb16ec","Type":"ContainerDied","Data":"e8df7da23d89011ade0720dd16c6f87299f8eab5ed24dc8dbc6674e6c981d6a4"} Oct 06 08:34:59 crc kubenswrapper[4991]: I1006 08:34:59.834127 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:34:59 crc kubenswrapper[4991]: I1006 08:34:59.962358 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-bundle\") pod \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " Oct 06 08:34:59 crc kubenswrapper[4991]: I1006 08:34:59.962438 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-util\") pod \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " Oct 06 08:34:59 crc kubenswrapper[4991]: I1006 08:34:59.962465 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4nzk\" (UniqueName: \"kubernetes.io/projected/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-kube-api-access-v4nzk\") pod \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\" (UID: \"8e1b7145-64f5-4381-86f7-23a8b1bb16ec\") " Oct 06 08:34:59 crc kubenswrapper[4991]: I1006 08:34:59.964826 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-bundle" (OuterVolumeSpecName: "bundle") pod 
"8e1b7145-64f5-4381-86f7-23a8b1bb16ec" (UID: "8e1b7145-64f5-4381-86f7-23a8b1bb16ec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:34:59 crc kubenswrapper[4991]: I1006 08:34:59.971520 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-kube-api-access-v4nzk" (OuterVolumeSpecName: "kube-api-access-v4nzk") pod "8e1b7145-64f5-4381-86f7-23a8b1bb16ec" (UID: "8e1b7145-64f5-4381-86f7-23a8b1bb16ec"). InnerVolumeSpecName "kube-api-access-v4nzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:34:59 crc kubenswrapper[4991]: I1006 08:34:59.989829 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-util" (OuterVolumeSpecName: "util") pod "8e1b7145-64f5-4381-86f7-23a8b1bb16ec" (UID: "8e1b7145-64f5-4381-86f7-23a8b1bb16ec"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:35:00 crc kubenswrapper[4991]: I1006 08:35:00.064171 4991 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:00 crc kubenswrapper[4991]: I1006 08:35:00.064218 4991 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-util\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:00 crc kubenswrapper[4991]: I1006 08:35:00.064237 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4nzk\" (UniqueName: \"kubernetes.io/projected/8e1b7145-64f5-4381-86f7-23a8b1bb16ec-kube-api-access-v4nzk\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:00 crc kubenswrapper[4991]: I1006 08:35:00.584450 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" event={"ID":"8e1b7145-64f5-4381-86f7-23a8b1bb16ec","Type":"ContainerDied","Data":"ed145f578a184ca7a19b6be2b3df3c1aabd802173a5a9c9c39f6c09c6a0ec6ba"} Oct 06 08:35:00 crc kubenswrapper[4991]: I1006 08:35:00.584512 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed145f578a184ca7a19b6be2b3df3c1aabd802173a5a9c9c39f6c09c6a0ec6ba" Oct 06 08:35:00 crc kubenswrapper[4991]: I1006 08:35:00.584538 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.790718 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd"] Oct 06 08:35:06 crc kubenswrapper[4991]: E1006 08:35:06.791336 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1b7145-64f5-4381-86f7-23a8b1bb16ec" containerName="util" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.791353 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1b7145-64f5-4381-86f7-23a8b1bb16ec" containerName="util" Oct 06 08:35:06 crc kubenswrapper[4991]: E1006 08:35:06.791374 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1b7145-64f5-4381-86f7-23a8b1bb16ec" containerName="extract" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.791382 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1b7145-64f5-4381-86f7-23a8b1bb16ec" containerName="extract" Oct 06 08:35:06 crc kubenswrapper[4991]: E1006 08:35:06.791406 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1b7145-64f5-4381-86f7-23a8b1bb16ec" containerName="pull" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.791414 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8e1b7145-64f5-4381-86f7-23a8b1bb16ec" containerName="pull" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.791541 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1b7145-64f5-4381-86f7-23a8b1bb16ec" containerName="extract" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.792036 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.794474 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.795593 4991 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-6qzzj" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.797773 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.815998 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd"] Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.853997 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbfdr\" (UniqueName: \"kubernetes.io/projected/9c3dd45e-9c88-48e9-8976-20df64a99255-kube-api-access-pbfdr\") pod \"cert-manager-operator-controller-manager-57cd46d6d-nxqzd\" (UID: \"9c3dd45e-9c88-48e9-8976-20df64a99255\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.955089 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbfdr\" (UniqueName: 
\"kubernetes.io/projected/9c3dd45e-9c88-48e9-8976-20df64a99255-kube-api-access-pbfdr\") pod \"cert-manager-operator-controller-manager-57cd46d6d-nxqzd\" (UID: \"9c3dd45e-9c88-48e9-8976-20df64a99255\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd" Oct 06 08:35:06 crc kubenswrapper[4991]: I1006 08:35:06.975973 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbfdr\" (UniqueName: \"kubernetes.io/projected/9c3dd45e-9c88-48e9-8976-20df64a99255-kube-api-access-pbfdr\") pod \"cert-manager-operator-controller-manager-57cd46d6d-nxqzd\" (UID: \"9c3dd45e-9c88-48e9-8976-20df64a99255\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd" Oct 06 08:35:07 crc kubenswrapper[4991]: I1006 08:35:07.109318 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd" Oct 06 08:35:07 crc kubenswrapper[4991]: I1006 08:35:07.569606 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd"] Oct 06 08:35:07 crc kubenswrapper[4991]: W1006 08:35:07.584960 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c3dd45e_9c88_48e9_8976_20df64a99255.slice/crio-10c82c9f343c34a0a6c8c6de35d077bd8ed324fd3004f15505fd06ea53a53f2c WatchSource:0}: Error finding container 10c82c9f343c34a0a6c8c6de35d077bd8ed324fd3004f15505fd06ea53a53f2c: Status 404 returned error can't find the container with id 10c82c9f343c34a0a6c8c6de35d077bd8ed324fd3004f15505fd06ea53a53f2c Oct 06 08:35:07 crc kubenswrapper[4991]: I1006 08:35:07.647322 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd" 
event={"ID":"9c3dd45e-9c88-48e9-8976-20df64a99255","Type":"ContainerStarted","Data":"10c82c9f343c34a0a6c8c6de35d077bd8ed324fd3004f15505fd06ea53a53f2c"} Oct 06 08:35:13 crc kubenswrapper[4991]: I1006 08:35:13.691865 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd" event={"ID":"9c3dd45e-9c88-48e9-8976-20df64a99255","Type":"ContainerStarted","Data":"70901edf590f48f5f1d3fdb95b8eeec4b3995ed66cbe44d6eeb0363e5aee2346"} Oct 06 08:35:13 crc kubenswrapper[4991]: I1006 08:35:13.722354 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nxqzd" podStartSLOduration=1.934709129 podStartE2EDuration="7.72228918s" podCreationTimestamp="2025-10-06 08:35:06 +0000 UTC" firstStartedPulling="2025-10-06 08:35:07.588885904 +0000 UTC m=+959.326635935" lastFinishedPulling="2025-10-06 08:35:13.376465965 +0000 UTC m=+965.114215986" observedRunningTime="2025-10-06 08:35:13.714063536 +0000 UTC m=+965.451813557" watchObservedRunningTime="2025-10-06 08:35:13.72228918 +0000 UTC m=+965.460039231" Oct 06 08:35:17 crc kubenswrapper[4991]: I1006 08:35:17.763213 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-f84wg"] Oct 06 08:35:17 crc kubenswrapper[4991]: I1006 08:35:17.764770 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-f84wg" Oct 06 08:35:17 crc kubenswrapper[4991]: I1006 08:35:17.766844 4991 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-v2msl" Oct 06 08:35:17 crc kubenswrapper[4991]: I1006 08:35:17.767031 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 06 08:35:17 crc kubenswrapper[4991]: I1006 08:35:17.767348 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 06 08:35:17 crc kubenswrapper[4991]: I1006 08:35:17.776395 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-f84wg"] Oct 06 08:35:17 crc kubenswrapper[4991]: I1006 08:35:17.901414 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a80db63a-219c-446f-aa49-748b3e9c9c38-bound-sa-token\") pod \"cert-manager-webhook-d969966f-f84wg\" (UID: \"a80db63a-219c-446f-aa49-748b3e9c9c38\") " pod="cert-manager/cert-manager-webhook-d969966f-f84wg" Oct 06 08:35:17 crc kubenswrapper[4991]: I1006 08:35:17.901879 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgjd8\" (UniqueName: \"kubernetes.io/projected/a80db63a-219c-446f-aa49-748b3e9c9c38-kube-api-access-qgjd8\") pod \"cert-manager-webhook-d969966f-f84wg\" (UID: \"a80db63a-219c-446f-aa49-748b3e9c9c38\") " pod="cert-manager/cert-manager-webhook-d969966f-f84wg" Oct 06 08:35:18 crc kubenswrapper[4991]: I1006 08:35:18.003656 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a80db63a-219c-446f-aa49-748b3e9c9c38-bound-sa-token\") pod \"cert-manager-webhook-d969966f-f84wg\" (UID: \"a80db63a-219c-446f-aa49-748b3e9c9c38\") " 
pod="cert-manager/cert-manager-webhook-d969966f-f84wg" Oct 06 08:35:18 crc kubenswrapper[4991]: I1006 08:35:18.003775 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgjd8\" (UniqueName: \"kubernetes.io/projected/a80db63a-219c-446f-aa49-748b3e9c9c38-kube-api-access-qgjd8\") pod \"cert-manager-webhook-d969966f-f84wg\" (UID: \"a80db63a-219c-446f-aa49-748b3e9c9c38\") " pod="cert-manager/cert-manager-webhook-d969966f-f84wg" Oct 06 08:35:18 crc kubenswrapper[4991]: I1006 08:35:18.027407 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a80db63a-219c-446f-aa49-748b3e9c9c38-bound-sa-token\") pod \"cert-manager-webhook-d969966f-f84wg\" (UID: \"a80db63a-219c-446f-aa49-748b3e9c9c38\") " pod="cert-manager/cert-manager-webhook-d969966f-f84wg" Oct 06 08:35:18 crc kubenswrapper[4991]: I1006 08:35:18.040689 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgjd8\" (UniqueName: \"kubernetes.io/projected/a80db63a-219c-446f-aa49-748b3e9c9c38-kube-api-access-qgjd8\") pod \"cert-manager-webhook-d969966f-f84wg\" (UID: \"a80db63a-219c-446f-aa49-748b3e9c9c38\") " pod="cert-manager/cert-manager-webhook-d969966f-f84wg" Oct 06 08:35:18 crc kubenswrapper[4991]: I1006 08:35:18.082312 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-f84wg" Oct 06 08:35:18 crc kubenswrapper[4991]: I1006 08:35:18.504027 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-f84wg"] Oct 06 08:35:18 crc kubenswrapper[4991]: W1006 08:35:18.520542 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda80db63a_219c_446f_aa49_748b3e9c9c38.slice/crio-18f5b9d877853d272980e3aea2a799efe1af39247baeddd6a2bde0c42fa59a43 WatchSource:0}: Error finding container 18f5b9d877853d272980e3aea2a799efe1af39247baeddd6a2bde0c42fa59a43: Status 404 returned error can't find the container with id 18f5b9d877853d272980e3aea2a799efe1af39247baeddd6a2bde0c42fa59a43 Oct 06 08:35:18 crc kubenswrapper[4991]: I1006 08:35:18.719005 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-f84wg" event={"ID":"a80db63a-219c-446f-aa49-748b3e9c9c38","Type":"ContainerStarted","Data":"18f5b9d877853d272980e3aea2a799efe1af39247baeddd6a2bde0c42fa59a43"} Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.106496 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq"] Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.107773 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.110033 4991 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2rwmk" Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.110983 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq"] Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.241601 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/991468e7-6b42-4164-9b94-14a34c770f48-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-k9pnq\" (UID: \"991468e7-6b42-4164-9b94-14a34c770f48\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.241984 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvjhz\" (UniqueName: \"kubernetes.io/projected/991468e7-6b42-4164-9b94-14a34c770f48-kube-api-access-nvjhz\") pod \"cert-manager-cainjector-7d9f95dbf-k9pnq\" (UID: \"991468e7-6b42-4164-9b94-14a34c770f48\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.343236 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvjhz\" (UniqueName: \"kubernetes.io/projected/991468e7-6b42-4164-9b94-14a34c770f48-kube-api-access-nvjhz\") pod \"cert-manager-cainjector-7d9f95dbf-k9pnq\" (UID: \"991468e7-6b42-4164-9b94-14a34c770f48\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.343402 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/991468e7-6b42-4164-9b94-14a34c770f48-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-k9pnq\" (UID: \"991468e7-6b42-4164-9b94-14a34c770f48\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.363640 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvjhz\" (UniqueName: \"kubernetes.io/projected/991468e7-6b42-4164-9b94-14a34c770f48-kube-api-access-nvjhz\") pod \"cert-manager-cainjector-7d9f95dbf-k9pnq\" (UID: \"991468e7-6b42-4164-9b94-14a34c770f48\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.379408 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/991468e7-6b42-4164-9b94-14a34c770f48-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-k9pnq\" (UID: \"991468e7-6b42-4164-9b94-14a34c770f48\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.429943 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" Oct 06 08:35:20 crc kubenswrapper[4991]: I1006 08:35:20.822948 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq"] Oct 06 08:35:21 crc kubenswrapper[4991]: I1006 08:35:21.741384 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" event={"ID":"991468e7-6b42-4164-9b94-14a34c770f48","Type":"ContainerStarted","Data":"57fa3fd4606f693e21d13288a0dbc7914e389417e69a8c316538501cc3550373"} Oct 06 08:35:22 crc kubenswrapper[4991]: I1006 08:35:22.747917 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" event={"ID":"991468e7-6b42-4164-9b94-14a34c770f48","Type":"ContainerStarted","Data":"0b5ca4ec71e04c54944990d01e4e5c29e97990520772556548c0841f58265a89"} Oct 06 08:35:22 crc kubenswrapper[4991]: I1006 08:35:22.749708 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-f84wg" event={"ID":"a80db63a-219c-446f-aa49-748b3e9c9c38","Type":"ContainerStarted","Data":"b49bad7d3c581c631dc0a8f0ef9e7a7c812642353a0e1cfef39229f87626e984"} Oct 06 08:35:22 crc kubenswrapper[4991]: I1006 08:35:22.749836 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-f84wg" Oct 06 08:35:22 crc kubenswrapper[4991]: I1006 08:35:22.769457 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-k9pnq" podStartSLOduration=1.231614255 podStartE2EDuration="2.769440623s" podCreationTimestamp="2025-10-06 08:35:20 +0000 UTC" firstStartedPulling="2025-10-06 08:35:20.83899382 +0000 UTC m=+972.576743861" lastFinishedPulling="2025-10-06 08:35:22.376820198 +0000 UTC m=+974.114570229" observedRunningTime="2025-10-06 08:35:22.767829056 +0000 UTC m=+974.505579107" 
watchObservedRunningTime="2025-10-06 08:35:22.769440623 +0000 UTC m=+974.507190644" Oct 06 08:35:22 crc kubenswrapper[4991]: I1006 08:35:22.791350 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-f84wg" podStartSLOduration=1.933171682 podStartE2EDuration="5.791326285s" podCreationTimestamp="2025-10-06 08:35:17 +0000 UTC" firstStartedPulling="2025-10-06 08:35:18.522826125 +0000 UTC m=+970.260576146" lastFinishedPulling="2025-10-06 08:35:22.380980688 +0000 UTC m=+974.118730749" observedRunningTime="2025-10-06 08:35:22.786108094 +0000 UTC m=+974.523858115" watchObservedRunningTime="2025-10-06 08:35:22.791326285 +0000 UTC m=+974.529076306" Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.086286 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-f84wg" Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.483949 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-7n4jj"] Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.484709 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.487347 4991 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xz4dg" Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.488761 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-7n4jj"] Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.569622 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk8wf\" (UniqueName: \"kubernetes.io/projected/46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1-kube-api-access-kk8wf\") pod \"cert-manager-7d4cc89fcb-7n4jj\" (UID: \"46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1\") " pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.569657 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-7n4jj\" (UID: \"46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1\") " pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.671556 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk8wf\" (UniqueName: \"kubernetes.io/projected/46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1-kube-api-access-kk8wf\") pod \"cert-manager-7d4cc89fcb-7n4jj\" (UID: \"46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1\") " pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.671628 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-7n4jj\" (UID: 
\"46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1\") " pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.701964 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk8wf\" (UniqueName: \"kubernetes.io/projected/46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1-kube-api-access-kk8wf\") pod \"cert-manager-7d4cc89fcb-7n4jj\" (UID: \"46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1\") " pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.702058 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-7n4jj\" (UID: \"46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1\") " pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" Oct 06 08:35:28 crc kubenswrapper[4991]: I1006 08:35:28.816715 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" Oct 06 08:35:29 crc kubenswrapper[4991]: I1006 08:35:29.051483 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-7n4jj"] Oct 06 08:35:29 crc kubenswrapper[4991]: W1006 08:35:29.052237 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46ef1459_4df8_4a30_b8c9_5b0b26f4f5a1.slice/crio-043829ffb776e4aec54b191dddd2bf1842a1a202378c540d2220d36db4f5f8e9 WatchSource:0}: Error finding container 043829ffb776e4aec54b191dddd2bf1842a1a202378c540d2220d36db4f5f8e9: Status 404 returned error can't find the container with id 043829ffb776e4aec54b191dddd2bf1842a1a202378c540d2220d36db4f5f8e9 Oct 06 08:35:29 crc kubenswrapper[4991]: I1006 08:35:29.798497 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" 
event={"ID":"46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1","Type":"ContainerStarted","Data":"1bdf9fb0ff25f26380d6cea70979b4108b0cf032be6a912990b04afaf0dab553"} Oct 06 08:35:29 crc kubenswrapper[4991]: I1006 08:35:29.798955 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" event={"ID":"46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1","Type":"ContainerStarted","Data":"043829ffb776e4aec54b191dddd2bf1842a1a202378c540d2220d36db4f5f8e9"} Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.066769 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-7n4jj" podStartSLOduration=14.066752805 podStartE2EDuration="14.066752805s" podCreationTimestamp="2025-10-06 08:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:35:29.817358806 +0000 UTC m=+981.555108817" watchObservedRunningTime="2025-10-06 08:35:42.066752805 +0000 UTC m=+993.804502826" Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.068164 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lvqcn"] Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.068881 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lvqcn" Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.070591 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.070979 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-bsxcg" Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.072215 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.139457 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lvqcn"] Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.156801 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t44w\" (UniqueName: \"kubernetes.io/projected/33544971-863a-4b28-a1e5-6eddea8e37c0-kube-api-access-5t44w\") pod \"openstack-operator-index-lvqcn\" (UID: \"33544971-863a-4b28-a1e5-6eddea8e37c0\") " pod="openstack-operators/openstack-operator-index-lvqcn" Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.258275 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t44w\" (UniqueName: \"kubernetes.io/projected/33544971-863a-4b28-a1e5-6eddea8e37c0-kube-api-access-5t44w\") pod \"openstack-operator-index-lvqcn\" (UID: \"33544971-863a-4b28-a1e5-6eddea8e37c0\") " pod="openstack-operators/openstack-operator-index-lvqcn" Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.279794 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t44w\" (UniqueName: \"kubernetes.io/projected/33544971-863a-4b28-a1e5-6eddea8e37c0-kube-api-access-5t44w\") pod \"openstack-operator-index-lvqcn\" (UID: 
\"33544971-863a-4b28-a1e5-6eddea8e37c0\") " pod="openstack-operators/openstack-operator-index-lvqcn" Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.394295 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lvqcn" Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.831589 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lvqcn"] Oct 06 08:35:42 crc kubenswrapper[4991]: I1006 08:35:42.926238 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lvqcn" event={"ID":"33544971-863a-4b28-a1e5-6eddea8e37c0","Type":"ContainerStarted","Data":"aa261c76e7321f47448ef450163dbe2e52d71ba6232a34c5956bb7d3237dd83f"} Oct 06 08:35:44 crc kubenswrapper[4991]: I1006 08:35:44.850163 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-lvqcn"] Oct 06 08:35:45 crc kubenswrapper[4991]: I1006 08:35:45.461845 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tzrrq"] Oct 06 08:35:45 crc kubenswrapper[4991]: I1006 08:35:45.463198 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tzrrq" Oct 06 08:35:45 crc kubenswrapper[4991]: I1006 08:35:45.468155 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tzrrq"] Oct 06 08:35:45 crc kubenswrapper[4991]: I1006 08:35:45.501611 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf2v5\" (UniqueName: \"kubernetes.io/projected/3fa73b60-5381-42ef-be66-f254fb2b80a1-kube-api-access-bf2v5\") pod \"openstack-operator-index-tzrrq\" (UID: \"3fa73b60-5381-42ef-be66-f254fb2b80a1\") " pod="openstack-operators/openstack-operator-index-tzrrq" Oct 06 08:35:45 crc kubenswrapper[4991]: I1006 08:35:45.603079 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf2v5\" (UniqueName: \"kubernetes.io/projected/3fa73b60-5381-42ef-be66-f254fb2b80a1-kube-api-access-bf2v5\") pod \"openstack-operator-index-tzrrq\" (UID: \"3fa73b60-5381-42ef-be66-f254fb2b80a1\") " pod="openstack-operators/openstack-operator-index-tzrrq" Oct 06 08:35:45 crc kubenswrapper[4991]: I1006 08:35:45.628611 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf2v5\" (UniqueName: \"kubernetes.io/projected/3fa73b60-5381-42ef-be66-f254fb2b80a1-kube-api-access-bf2v5\") pod \"openstack-operator-index-tzrrq\" (UID: \"3fa73b60-5381-42ef-be66-f254fb2b80a1\") " pod="openstack-operators/openstack-operator-index-tzrrq" Oct 06 08:35:45 crc kubenswrapper[4991]: I1006 08:35:45.783513 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tzrrq" Oct 06 08:35:45 crc kubenswrapper[4991]: I1006 08:35:45.952991 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lvqcn" event={"ID":"33544971-863a-4b28-a1e5-6eddea8e37c0","Type":"ContainerStarted","Data":"456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f"} Oct 06 08:35:45 crc kubenswrapper[4991]: I1006 08:35:45.953148 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-lvqcn" podUID="33544971-863a-4b28-a1e5-6eddea8e37c0" containerName="registry-server" containerID="cri-o://456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f" gracePeriod=2 Oct 06 08:35:45 crc kubenswrapper[4991]: I1006 08:35:45.976517 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lvqcn" podStartSLOduration=1.573413842 podStartE2EDuration="3.97649542s" podCreationTimestamp="2025-10-06 08:35:42 +0000 UTC" firstStartedPulling="2025-10-06 08:35:42.851101979 +0000 UTC m=+994.588852000" lastFinishedPulling="2025-10-06 08:35:45.254183557 +0000 UTC m=+996.991933578" observedRunningTime="2025-10-06 08:35:45.97026432 +0000 UTC m=+997.708014351" watchObservedRunningTime="2025-10-06 08:35:45.97649542 +0000 UTC m=+997.714245461" Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.018092 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tzrrq"] Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.347805 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lvqcn" Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.412784 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t44w\" (UniqueName: \"kubernetes.io/projected/33544971-863a-4b28-a1e5-6eddea8e37c0-kube-api-access-5t44w\") pod \"33544971-863a-4b28-a1e5-6eddea8e37c0\" (UID: \"33544971-863a-4b28-a1e5-6eddea8e37c0\") " Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.418468 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33544971-863a-4b28-a1e5-6eddea8e37c0-kube-api-access-5t44w" (OuterVolumeSpecName: "kube-api-access-5t44w") pod "33544971-863a-4b28-a1e5-6eddea8e37c0" (UID: "33544971-863a-4b28-a1e5-6eddea8e37c0"). InnerVolumeSpecName "kube-api-access-5t44w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.514079 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t44w\" (UniqueName: \"kubernetes.io/projected/33544971-863a-4b28-a1e5-6eddea8e37c0-kube-api-access-5t44w\") on node \"crc\" DevicePath \"\"" Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.963142 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tzrrq" event={"ID":"3fa73b60-5381-42ef-be66-f254fb2b80a1","Type":"ContainerStarted","Data":"16a6c19d9b9d58bb79eb3627579ba9930f339d5876e88abb06e6c931fdcbab24"} Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.964525 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tzrrq" event={"ID":"3fa73b60-5381-42ef-be66-f254fb2b80a1","Type":"ContainerStarted","Data":"4543d19fa1494a785ef9dc4138ead83c1aecb28383b210d0930019a166005856"} Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.965171 4991 generic.go:334] "Generic (PLEG): container finished" 
podID="33544971-863a-4b28-a1e5-6eddea8e37c0" containerID="456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f" exitCode=0 Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.965205 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lvqcn" event={"ID":"33544971-863a-4b28-a1e5-6eddea8e37c0","Type":"ContainerDied","Data":"456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f"} Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.965193 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lvqcn" Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.965228 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lvqcn" event={"ID":"33544971-863a-4b28-a1e5-6eddea8e37c0","Type":"ContainerDied","Data":"aa261c76e7321f47448ef450163dbe2e52d71ba6232a34c5956bb7d3237dd83f"} Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.965245 4991 scope.go:117] "RemoveContainer" containerID="456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f" Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.983204 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tzrrq" podStartSLOduration=1.904526653 podStartE2EDuration="1.983178063s" podCreationTimestamp="2025-10-06 08:35:45 +0000 UTC" firstStartedPulling="2025-10-06 08:35:46.05960234 +0000 UTC m=+997.797352361" lastFinishedPulling="2025-10-06 08:35:46.13825375 +0000 UTC m=+997.876003771" observedRunningTime="2025-10-06 08:35:46.978405995 +0000 UTC m=+998.716156046" watchObservedRunningTime="2025-10-06 08:35:46.983178063 +0000 UTC m=+998.720928114" Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.991965 4991 scope.go:117] "RemoveContainer" containerID="456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f" Oct 06 08:35:46 crc 
kubenswrapper[4991]: E1006 08:35:46.992485 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f\": container with ID starting with 456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f not found: ID does not exist" containerID="456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f" Oct 06 08:35:46 crc kubenswrapper[4991]: I1006 08:35:46.992518 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f"} err="failed to get container status \"456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f\": rpc error: code = NotFound desc = could not find container \"456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f\": container with ID starting with 456f6924f9a1a0ccd4626227b9f8821e6fc0bd694f49a4c3686e644cfa46300f not found: ID does not exist" Oct 06 08:35:47 crc kubenswrapper[4991]: I1006 08:35:47.014987 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-lvqcn"] Oct 06 08:35:47 crc kubenswrapper[4991]: I1006 08:35:47.020699 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-lvqcn"] Oct 06 08:35:47 crc kubenswrapper[4991]: I1006 08:35:47.255102 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33544971-863a-4b28-a1e5-6eddea8e37c0" path="/var/lib/kubelet/pods/33544971-863a-4b28-a1e5-6eddea8e37c0/volumes" Oct 06 08:35:55 crc kubenswrapper[4991]: I1006 08:35:55.784368 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tzrrq" Oct 06 08:35:55 crc kubenswrapper[4991]: I1006 08:35:55.785061 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-tzrrq" Oct 06 08:35:55 crc kubenswrapper[4991]: I1006 08:35:55.815191 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tzrrq" Oct 06 08:35:56 crc kubenswrapper[4991]: I1006 08:35:56.073993 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tzrrq" Oct 06 08:35:58 crc kubenswrapper[4991]: I1006 08:35:58.901713 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh"] Oct 06 08:35:58 crc kubenswrapper[4991]: E1006 08:35:58.902194 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33544971-863a-4b28-a1e5-6eddea8e37c0" containerName="registry-server" Oct 06 08:35:58 crc kubenswrapper[4991]: I1006 08:35:58.902206 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="33544971-863a-4b28-a1e5-6eddea8e37c0" containerName="registry-server" Oct 06 08:35:58 crc kubenswrapper[4991]: I1006 08:35:58.902332 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="33544971-863a-4b28-a1e5-6eddea8e37c0" containerName="registry-server" Oct 06 08:35:58 crc kubenswrapper[4991]: I1006 08:35:58.903128 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:58 crc kubenswrapper[4991]: I1006 08:35:58.904923 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xlhb6" Oct 06 08:35:58 crc kubenswrapper[4991]: I1006 08:35:58.913169 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh"] Oct 06 08:35:58 crc kubenswrapper[4991]: I1006 08:35:58.998267 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-bundle\") pod \"40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:58 crc kubenswrapper[4991]: I1006 08:35:58.998420 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-util\") pod \"40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:58 crc kubenswrapper[4991]: I1006 08:35:58.998480 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dklm7\" (UniqueName: \"kubernetes.io/projected/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-kube-api-access-dklm7\") pod \"40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:59 crc kubenswrapper[4991]: I1006 
08:35:59.100123 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-util\") pod \"40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:59 crc kubenswrapper[4991]: I1006 08:35:59.100209 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dklm7\" (UniqueName: \"kubernetes.io/projected/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-kube-api-access-dklm7\") pod \"40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:59 crc kubenswrapper[4991]: I1006 08:35:59.100364 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-bundle\") pod \"40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:59 crc kubenswrapper[4991]: I1006 08:35:59.100914 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-bundle\") pod \"40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:59 crc kubenswrapper[4991]: I1006 08:35:59.101136 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-util\") pod \"40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:59 crc kubenswrapper[4991]: I1006 08:35:59.128343 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dklm7\" (UniqueName: \"kubernetes.io/projected/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-kube-api-access-dklm7\") pod \"40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:59 crc kubenswrapper[4991]: I1006 08:35:59.240756 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:35:59 crc kubenswrapper[4991]: I1006 08:35:59.723602 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh"] Oct 06 08:35:59 crc kubenswrapper[4991]: W1006 08:35:59.738166 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55afad66_4c00_4f7b_bb4a_7cc0eb6c6742.slice/crio-3be6bc918846d7753363d9c8dd6bc1407d5dfadaa28486bddf95155f79487d59 WatchSource:0}: Error finding container 3be6bc918846d7753363d9c8dd6bc1407d5dfadaa28486bddf95155f79487d59: Status 404 returned error can't find the container with id 3be6bc918846d7753363d9c8dd6bc1407d5dfadaa28486bddf95155f79487d59 Oct 06 08:36:00 crc kubenswrapper[4991]: I1006 08:36:00.066728 4991 generic.go:334] "Generic (PLEG): container finished" podID="55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" containerID="161513d65156eb8c311083b844a5c29a125c6423de42f73f243da14c596423ef" exitCode=0 Oct 06 
08:36:00 crc kubenswrapper[4991]: I1006 08:36:00.066833 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" event={"ID":"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742","Type":"ContainerDied","Data":"161513d65156eb8c311083b844a5c29a125c6423de42f73f243da14c596423ef"} Oct 06 08:36:00 crc kubenswrapper[4991]: I1006 08:36:00.067096 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" event={"ID":"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742","Type":"ContainerStarted","Data":"3be6bc918846d7753363d9c8dd6bc1407d5dfadaa28486bddf95155f79487d59"} Oct 06 08:36:01 crc kubenswrapper[4991]: I1006 08:36:01.078196 4991 generic.go:334] "Generic (PLEG): container finished" podID="55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" containerID="066604ae7646b9852bb973639aba8a5ea1a847f929f77aad792d617425d9c9a0" exitCode=0 Oct 06 08:36:01 crc kubenswrapper[4991]: I1006 08:36:01.078335 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" event={"ID":"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742","Type":"ContainerDied","Data":"066604ae7646b9852bb973639aba8a5ea1a847f929f77aad792d617425d9c9a0"} Oct 06 08:36:02 crc kubenswrapper[4991]: I1006 08:36:02.090952 4991 generic.go:334] "Generic (PLEG): container finished" podID="55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" containerID="5ee81e53a7aa3f4ed74101a347cb94ecfe7ca0ee0c33dc653562f94b4f79c502" exitCode=0 Oct 06 08:36:02 crc kubenswrapper[4991]: I1006 08:36:02.091021 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" event={"ID":"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742","Type":"ContainerDied","Data":"5ee81e53a7aa3f4ed74101a347cb94ecfe7ca0ee0c33dc653562f94b4f79c502"} Oct 06 08:36:03 crc kubenswrapper[4991]: I1006 08:36:03.445092 
4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:36:03 crc kubenswrapper[4991]: I1006 08:36:03.563479 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-util\") pod \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " Oct 06 08:36:03 crc kubenswrapper[4991]: I1006 08:36:03.563566 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-bundle\") pod \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " Oct 06 08:36:03 crc kubenswrapper[4991]: I1006 08:36:03.563683 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dklm7\" (UniqueName: \"kubernetes.io/projected/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-kube-api-access-dklm7\") pod \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\" (UID: \"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742\") " Oct 06 08:36:03 crc kubenswrapper[4991]: I1006 08:36:03.564402 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-bundle" (OuterVolumeSpecName: "bundle") pod "55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" (UID: "55afad66-4c00-4f7b-bb4a-7cc0eb6c6742"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:36:03 crc kubenswrapper[4991]: I1006 08:36:03.573864 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-kube-api-access-dklm7" (OuterVolumeSpecName: "kube-api-access-dklm7") pod "55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" (UID: "55afad66-4c00-4f7b-bb4a-7cc0eb6c6742"). 
InnerVolumeSpecName "kube-api-access-dklm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:36:03 crc kubenswrapper[4991]: I1006 08:36:03.594822 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-util" (OuterVolumeSpecName: "util") pod "55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" (UID: "55afad66-4c00-4f7b-bb4a-7cc0eb6c6742"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:36:03 crc kubenswrapper[4991]: I1006 08:36:03.665131 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dklm7\" (UniqueName: \"kubernetes.io/projected/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-kube-api-access-dklm7\") on node \"crc\" DevicePath \"\"" Oct 06 08:36:03 crc kubenswrapper[4991]: I1006 08:36:03.665197 4991 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-util\") on node \"crc\" DevicePath \"\"" Oct 06 08:36:03 crc kubenswrapper[4991]: I1006 08:36:03.665224 4991 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55afad66-4c00-4f7b-bb4a-7cc0eb6c6742-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:36:04 crc kubenswrapper[4991]: I1006 08:36:04.110284 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" event={"ID":"55afad66-4c00-4f7b-bb4a-7cc0eb6c6742","Type":"ContainerDied","Data":"3be6bc918846d7753363d9c8dd6bc1407d5dfadaa28486bddf95155f79487d59"} Oct 06 08:36:04 crc kubenswrapper[4991]: I1006 08:36:04.110373 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3be6bc918846d7753363d9c8dd6bc1407d5dfadaa28486bddf95155f79487d59" Oct 06 08:36:04 crc kubenswrapper[4991]: I1006 08:36:04.110418 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.502691 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg"] Oct 06 08:36:08 crc kubenswrapper[4991]: E1006 08:36:08.503577 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" containerName="pull" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.503594 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" containerName="pull" Oct 06 08:36:08 crc kubenswrapper[4991]: E1006 08:36:08.503611 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" containerName="util" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.503618 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" containerName="util" Oct 06 08:36:08 crc kubenswrapper[4991]: E1006 08:36:08.503634 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" containerName="extract" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.503642 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" containerName="extract" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.503772 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="55afad66-4c00-4f7b-bb4a-7cc0eb6c6742" containerName="extract" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.504528 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.506816 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-l7htm" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.521406 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5gct\" (UniqueName: \"kubernetes.io/projected/efe6525f-b400-4474-b6c6-d26c4ab8a43c-kube-api-access-w5gct\") pod \"openstack-operator-controller-operator-856746bff4-lbshg\" (UID: \"efe6525f-b400-4474-b6c6-d26c4ab8a43c\") " pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.525795 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg"] Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.622833 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5gct\" (UniqueName: \"kubernetes.io/projected/efe6525f-b400-4474-b6c6-d26c4ab8a43c-kube-api-access-w5gct\") pod \"openstack-operator-controller-operator-856746bff4-lbshg\" (UID: \"efe6525f-b400-4474-b6c6-d26c4ab8a43c\") " pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.641316 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5gct\" (UniqueName: \"kubernetes.io/projected/efe6525f-b400-4474-b6c6-d26c4ab8a43c-kube-api-access-w5gct\") pod \"openstack-operator-controller-operator-856746bff4-lbshg\" (UID: \"efe6525f-b400-4474-b6c6-d26c4ab8a43c\") " pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" Oct 06 08:36:08 crc kubenswrapper[4991]: I1006 08:36:08.825269 4991 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" Oct 06 08:36:09 crc kubenswrapper[4991]: I1006 08:36:09.269153 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg"] Oct 06 08:36:09 crc kubenswrapper[4991]: W1006 08:36:09.273785 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefe6525f_b400_4474_b6c6_d26c4ab8a43c.slice/crio-59e1bca08b11e8d9d02c476b73c542e0d33f6798715a415af7b28f12ceaf07ac WatchSource:0}: Error finding container 59e1bca08b11e8d9d02c476b73c542e0d33f6798715a415af7b28f12ceaf07ac: Status 404 returned error can't find the container with id 59e1bca08b11e8d9d02c476b73c542e0d33f6798715a415af7b28f12ceaf07ac Oct 06 08:36:10 crc kubenswrapper[4991]: I1006 08:36:10.151376 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" event={"ID":"efe6525f-b400-4474-b6c6-d26c4ab8a43c","Type":"ContainerStarted","Data":"59e1bca08b11e8d9d02c476b73c542e0d33f6798715a415af7b28f12ceaf07ac"} Oct 06 08:36:13 crc kubenswrapper[4991]: I1006 08:36:13.171722 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" event={"ID":"efe6525f-b400-4474-b6c6-d26c4ab8a43c","Type":"ContainerStarted","Data":"780437ae96e30021e767531d022de04b2a14c4531474af7acaf8c10d2281f07e"} Oct 06 08:36:16 crc kubenswrapper[4991]: I1006 08:36:16.192448 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" event={"ID":"efe6525f-b400-4474-b6c6-d26c4ab8a43c","Type":"ContainerStarted","Data":"ed0dd74ba8f6ff58ccb88350d0e451cf7703190c55e035f7df8984777171c8fb"} Oct 06 08:36:16 crc kubenswrapper[4991]: I1006 
08:36:16.192931 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" Oct 06 08:36:16 crc kubenswrapper[4991]: I1006 08:36:16.260042 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" podStartSLOduration=2.304983453 podStartE2EDuration="8.260005404s" podCreationTimestamp="2025-10-06 08:36:08 +0000 UTC" firstStartedPulling="2025-10-06 08:36:09.27610976 +0000 UTC m=+1021.013859781" lastFinishedPulling="2025-10-06 08:36:15.231131711 +0000 UTC m=+1026.968881732" observedRunningTime="2025-10-06 08:36:16.245920657 +0000 UTC m=+1027.983670718" watchObservedRunningTime="2025-10-06 08:36:16.260005404 +0000 UTC m=+1027.997755475" Oct 06 08:36:17 crc kubenswrapper[4991]: I1006 08:36:17.204553 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-856746bff4-lbshg" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.251121 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.252937 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.256048 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-dnwnl" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.259076 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.260207 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.261908 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-k5l9g" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.265716 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckhxb\" (UniqueName: \"kubernetes.io/projected/605ba4cf-892d-451c-af8d-a6863c67898d-kube-api-access-ckhxb\") pod \"barbican-operator-controller-manager-5f7c849b98-jlsb9\" (UID: \"605ba4cf-892d-451c-af8d-a6863c67898d\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.271971 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.276445 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.282879 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.283864 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.287613 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-hh45z" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.292805 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.297478 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.303188 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-tr849" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.309352 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.314028 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.369287 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f2hq\" (UniqueName: \"kubernetes.io/projected/a69a6896-7855-4b15-b0b9-e26f87ad2864-kube-api-access-8f2hq\") pod \"glance-operator-controller-manager-5568b5d68-25vl9\" (UID: \"a69a6896-7855-4b15-b0b9-e26f87ad2864\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.369423 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2nq\" 
(UniqueName: \"kubernetes.io/projected/a85c6bdb-d40d-428f-8f1e-0001b8dd34f7-kube-api-access-kn2nq\") pod \"cinder-operator-controller-manager-7d4d4f8d-wrsh9\" (UID: \"a85c6bdb-d40d-428f-8f1e-0001b8dd34f7\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.369463 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckhxb\" (UniqueName: \"kubernetes.io/projected/605ba4cf-892d-451c-af8d-a6863c67898d-kube-api-access-ckhxb\") pod \"barbican-operator-controller-manager-5f7c849b98-jlsb9\" (UID: \"605ba4cf-892d-451c-af8d-a6863c67898d\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.369499 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9679t\" (UniqueName: \"kubernetes.io/projected/ea26b29a-2a8d-4f43-8471-8f875d278b8f-kube-api-access-9679t\") pod \"designate-operator-controller-manager-75dfd9b554-nsr9g\" (UID: \"ea26b29a-2a8d-4f43-8471-8f875d278b8f\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.373139 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.374438 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.379288 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-qc54l" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.394981 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.396237 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.408437 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-2sfs6" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.408969 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckhxb\" (UniqueName: \"kubernetes.io/projected/605ba4cf-892d-451c-af8d-a6863c67898d-kube-api-access-ckhxb\") pod \"barbican-operator-controller-manager-5f7c849b98-jlsb9\" (UID: \"605ba4cf-892d-451c-af8d-a6863c67898d\") " pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.409046 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.413568 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.436802 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk"] Oct 06 08:36:33 crc 
kubenswrapper[4991]: I1006 08:36:33.438381 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.447106 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.447802 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-6qzm6" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.448064 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.449358 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-h2655"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.450483 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.456747 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-q8tsv" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.471381 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f2hq\" (UniqueName: \"kubernetes.io/projected/a69a6896-7855-4b15-b0b9-e26f87ad2864-kube-api-access-8f2hq\") pod \"glance-operator-controller-manager-5568b5d68-25vl9\" (UID: \"a69a6896-7855-4b15-b0b9-e26f87ad2864\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.472425 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b78f6\" (UniqueName: \"kubernetes.io/projected/38df1b74-dd97-43b9-a172-93ca631f8467-kube-api-access-b78f6\") pod \"ironic-operator-controller-manager-699b87f775-h2655\" (UID: \"38df1b74-dd97-43b9-a172-93ca631f8467\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.472494 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25827c7f-a146-4c7c-900f-670c747d6a15-cert\") pod \"infra-operator-controller-manager-658588b8c9-phjxk\" (UID: \"25827c7f-a146-4c7c-900f-670c747d6a15\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.472545 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fglng\" (UniqueName: \"kubernetes.io/projected/9315f646-15d5-4638-8f93-2b7ee013a548-kube-api-access-fglng\") pod 
\"horizon-operator-controller-manager-54876c876f-dnp2m\" (UID: \"9315f646-15d5-4638-8f93-2b7ee013a548\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.472576 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp7wn\" (UniqueName: \"kubernetes.io/projected/25827c7f-a146-4c7c-900f-670c747d6a15-kube-api-access-kp7wn\") pod \"infra-operator-controller-manager-658588b8c9-phjxk\" (UID: \"25827c7f-a146-4c7c-900f-670c747d6a15\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.472754 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn2nq\" (UniqueName: \"kubernetes.io/projected/a85c6bdb-d40d-428f-8f1e-0001b8dd34f7-kube-api-access-kn2nq\") pod \"cinder-operator-controller-manager-7d4d4f8d-wrsh9\" (UID: \"a85c6bdb-d40d-428f-8f1e-0001b8dd34f7\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.472833 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9679t\" (UniqueName: \"kubernetes.io/projected/ea26b29a-2a8d-4f43-8471-8f875d278b8f-kube-api-access-9679t\") pod \"designate-operator-controller-manager-75dfd9b554-nsr9g\" (UID: \"ea26b29a-2a8d-4f43-8471-8f875d278b8f\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.472869 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cp46\" (UniqueName: \"kubernetes.io/projected/2fcb483a-426d-49ef-9126-c5c8e4ff3a17-kube-api-access-7cp46\") pod \"heat-operator-controller-manager-8f58bc9db-94qvn\" (UID: \"2fcb483a-426d-49ef-9126-c5c8e4ff3a17\") " 
pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.495869 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-h2655"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.504289 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9679t\" (UniqueName: \"kubernetes.io/projected/ea26b29a-2a8d-4f43-8471-8f875d278b8f-kube-api-access-9679t\") pod \"designate-operator-controller-manager-75dfd9b554-nsr9g\" (UID: \"ea26b29a-2a8d-4f43-8471-8f875d278b8f\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.509382 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn2nq\" (UniqueName: \"kubernetes.io/projected/a85c6bdb-d40d-428f-8f1e-0001b8dd34f7-kube-api-access-kn2nq\") pod \"cinder-operator-controller-manager-7d4d4f8d-wrsh9\" (UID: \"a85c6bdb-d40d-428f-8f1e-0001b8dd34f7\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.514338 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.515236 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.517415 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-7c54w" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.523975 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f2hq\" (UniqueName: \"kubernetes.io/projected/a69a6896-7855-4b15-b0b9-e26f87ad2864-kube-api-access-8f2hq\") pod \"glance-operator-controller-manager-5568b5d68-25vl9\" (UID: \"a69a6896-7855-4b15-b0b9-e26f87ad2864\") " pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.529779 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.531002 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.533687 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vhtkm" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.550559 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.552128 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.554405 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vjh9h" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.562546 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.563834 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.565483 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-mg7bx" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.571570 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.572713 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.573199 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.574177 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5kc6\" (UniqueName: \"kubernetes.io/projected/db7afb25-5117-448c-aa10-aaad9f53b2d2-kube-api-access-z5kc6\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z\" (UID: \"db7afb25-5117-448c-aa10-aaad9f53b2d2\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.574262 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25827c7f-a146-4c7c-900f-670c747d6a15-cert\") pod \"infra-operator-controller-manager-658588b8c9-phjxk\" (UID: \"25827c7f-a146-4c7c-900f-670c747d6a15\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.574288 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fglng\" (UniqueName: \"kubernetes.io/projected/9315f646-15d5-4638-8f93-2b7ee013a548-kube-api-access-fglng\") pod \"horizon-operator-controller-manager-54876c876f-dnp2m\" (UID: \"9315f646-15d5-4638-8f93-2b7ee013a548\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.574350 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp7wn\" (UniqueName: \"kubernetes.io/projected/25827c7f-a146-4c7c-900f-670c747d6a15-kube-api-access-kp7wn\") pod \"infra-operator-controller-manager-658588b8c9-phjxk\" (UID: \"25827c7f-a146-4c7c-900f-670c747d6a15\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.574407 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cf7n\" (UniqueName: \"kubernetes.io/projected/54e0400f-2429-4520-9d49-82915611ff23-kube-api-access-2cf7n\") pod \"keystone-operator-controller-manager-655d88ccb9-v7mb5\" (UID: \"54e0400f-2429-4520-9d49-82915611ff23\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.574449 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cp46\" (UniqueName: \"kubernetes.io/projected/2fcb483a-426d-49ef-9126-c5c8e4ff3a17-kube-api-access-7cp46\") pod \"heat-operator-controller-manager-8f58bc9db-94qvn\" (UID: \"2fcb483a-426d-49ef-9126-c5c8e4ff3a17\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.574511 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b78f6\" (UniqueName: \"kubernetes.io/projected/38df1b74-dd97-43b9-a172-93ca631f8467-kube-api-access-b78f6\") pod \"ironic-operator-controller-manager-699b87f775-h2655\" (UID: \"38df1b74-dd97-43b9-a172-93ca631f8467\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.574531 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kr9t\" (UniqueName: \"kubernetes.io/projected/d187cd97-0019-488f-9f51-339b4ee5c699-kube-api-access-6kr9t\") pod \"neutron-operator-controller-manager-8d984cc4d-wzcnb\" (UID: \"d187cd97-0019-488f-9f51-339b4ee5c699\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" Oct 06 08:36:33 crc kubenswrapper[4991]: E1006 08:36:33.574679 4991 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Oct 06 08:36:33 crc kubenswrapper[4991]: E1006 08:36:33.574739 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25827c7f-a146-4c7c-900f-670c747d6a15-cert podName:25827c7f-a146-4c7c-900f-670c747d6a15 nodeName:}" failed. No retries permitted until 2025-10-06 08:36:34.074721119 +0000 UTC m=+1045.812471140 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/25827c7f-a146-4c7c-900f-670c747d6a15-cert") pod "infra-operator-controller-manager-658588b8c9-phjxk" (UID: "25827c7f-a146-4c7c-900f-670c747d6a15") : secret "infra-operator-webhook-server-cert" not found Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.576551 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-wfn5w" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.580577 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.590433 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.603337 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.604438 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b78f6\" (UniqueName: \"kubernetes.io/projected/38df1b74-dd97-43b9-a172-93ca631f8467-kube-api-access-b78f6\") pod \"ironic-operator-controller-manager-699b87f775-h2655\" (UID: \"38df1b74-dd97-43b9-a172-93ca631f8467\") " pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.609660 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.618746 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.620167 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cp46\" (UniqueName: \"kubernetes.io/projected/2fcb483a-426d-49ef-9126-c5c8e4ff3a17-kube-api-access-7cp46\") pod \"heat-operator-controller-manager-8f58bc9db-94qvn\" (UID: \"2fcb483a-426d-49ef-9126-c5c8e4ff3a17\") " pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.623610 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.636529 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.637525 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.639502 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-sc7f7" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.648832 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fglng\" (UniqueName: \"kubernetes.io/projected/9315f646-15d5-4638-8f93-2b7ee013a548-kube-api-access-fglng\") pod \"horizon-operator-controller-manager-54876c876f-dnp2m\" (UID: \"9315f646-15d5-4638-8f93-2b7ee013a548\") " pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.650853 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp7wn\" (UniqueName: \"kubernetes.io/projected/25827c7f-a146-4c7c-900f-670c747d6a15-kube-api-access-kp7wn\") pod \"infra-operator-controller-manager-658588b8c9-phjxk\" (UID: \"25827c7f-a146-4c7c-900f-670c747d6a15\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.655620 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.670485 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.679765 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kr9t\" (UniqueName: \"kubernetes.io/projected/d187cd97-0019-488f-9f51-339b4ee5c699-kube-api-access-6kr9t\") pod \"neutron-operator-controller-manager-8d984cc4d-wzcnb\" (UID: \"d187cd97-0019-488f-9f51-339b4ee5c699\") " 
pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.679810 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5kc6\" (UniqueName: \"kubernetes.io/projected/db7afb25-5117-448c-aa10-aaad9f53b2d2-kube-api-access-z5kc6\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z\" (UID: \"db7afb25-5117-448c-aa10-aaad9f53b2d2\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.679880 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sthcw\" (UniqueName: \"kubernetes.io/projected/ead03587-67bc-428d-a356-b00483d82715-kube-api-access-sthcw\") pod \"nova-operator-controller-manager-7c7fc454ff-qrvjh\" (UID: \"ead03587-67bc-428d-a356-b00483d82715\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.679900 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp9r7\" (UniqueName: \"kubernetes.io/projected/65e5f035-09a2-492c-8474-9b6441c345a3-kube-api-access-wp9r7\") pod \"manila-operator-controller-manager-65d89cfd9f-klmdp\" (UID: \"65e5f035-09a2-492c-8474-9b6441c345a3\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.679922 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n562q\" (UniqueName: \"kubernetes.io/projected/4f7769cb-2786-4ff1-8991-b8d073a47967-kube-api-access-n562q\") pod \"octavia-operator-controller-manager-7468f855d8-vf4gj\" (UID: \"4f7769cb-2786-4ff1-8991-b8d073a47967\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" Oct 06 08:36:33 crc 
kubenswrapper[4991]: I1006 08:36:33.679952 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cf7n\" (UniqueName: \"kubernetes.io/projected/54e0400f-2429-4520-9d49-82915611ff23-kube-api-access-2cf7n\") pod \"keystone-operator-controller-manager-655d88ccb9-v7mb5\" (UID: \"54e0400f-2429-4520-9d49-82915611ff23\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.691682 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.696767 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kr9t\" (UniqueName: \"kubernetes.io/projected/d187cd97-0019-488f-9f51-339b4ee5c699-kube-api-access-6kr9t\") pod \"neutron-operator-controller-manager-8d984cc4d-wzcnb\" (UID: \"d187cd97-0019-488f-9f51-339b4ee5c699\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.697134 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5kc6\" (UniqueName: \"kubernetes.io/projected/db7afb25-5117-448c-aa10-aaad9f53b2d2-kube-api-access-z5kc6\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z\" (UID: \"db7afb25-5117-448c-aa10-aaad9f53b2d2\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.699780 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cf7n\" (UniqueName: \"kubernetes.io/projected/54e0400f-2429-4520-9d49-82915611ff23-kube-api-access-2cf7n\") pod \"keystone-operator-controller-manager-655d88ccb9-v7mb5\" (UID: \"54e0400f-2429-4520-9d49-82915611ff23\") " pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" 
Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.710136 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.710398 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.712687 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.716094 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-7xsc5" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.719098 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.745595 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.746869 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.748753 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.751636 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-t72w4" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.760362 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.761464 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.763231 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-n79dx" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.777778 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.780924 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sthcw\" (UniqueName: \"kubernetes.io/projected/ead03587-67bc-428d-a356-b00483d82715-kube-api-access-sthcw\") pod \"nova-operator-controller-manager-7c7fc454ff-qrvjh\" (UID: \"ead03587-67bc-428d-a356-b00483d82715\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.780968 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp9r7\" (UniqueName: \"kubernetes.io/projected/65e5f035-09a2-492c-8474-9b6441c345a3-kube-api-access-wp9r7\") pod \"manila-operator-controller-manager-65d89cfd9f-klmdp\" (UID: \"65e5f035-09a2-492c-8474-9b6441c345a3\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.780987 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n562q\" (UniqueName: \"kubernetes.io/projected/4f7769cb-2786-4ff1-8991-b8d073a47967-kube-api-access-n562q\") pod \"octavia-operator-controller-manager-7468f855d8-vf4gj\" (UID: \"4f7769cb-2786-4ff1-8991-b8d073a47967\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.781031 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h74q5\" (UniqueName: \"kubernetes.io/projected/1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f-kube-api-access-h74q5\") pod \"placement-operator-controller-manager-54689d9f88-mqlt4\" (UID: \"1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" Oct 06 08:36:33 crc 
kubenswrapper[4991]: I1006 08:36:33.781053 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x4ll\" (UniqueName: \"kubernetes.io/projected/3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60-kube-api-access-4x4ll\") pod \"ovn-operator-controller-manager-579449c7d5-29dxq\" (UID: \"3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.781075 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtnvl\" (UniqueName: \"kubernetes.io/projected/94e75499-3011-4f29-9c5c-9a5cbea7d10f-kube-api-access-xtnvl\") pod \"swift-operator-controller-manager-6859f9b676-tq9r2\" (UID: \"94e75499-3011-4f29-9c5c-9a5cbea7d10f\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.783792 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.789108 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.790382 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.792034 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bq9p2" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.792915 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.815754 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.829682 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.838020 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n562q\" (UniqueName: \"kubernetes.io/projected/4f7769cb-2786-4ff1-8991-b8d073a47967-kube-api-access-n562q\") pod \"octavia-operator-controller-manager-7468f855d8-vf4gj\" (UID: \"4f7769cb-2786-4ff1-8991-b8d073a47967\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.839922 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sthcw\" (UniqueName: \"kubernetes.io/projected/ead03587-67bc-428d-a356-b00483d82715-kube-api-access-sthcw\") pod \"nova-operator-controller-manager-7c7fc454ff-qrvjh\" (UID: \"ead03587-67bc-428d-a356-b00483d82715\") " 
pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.845940 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp9r7\" (UniqueName: \"kubernetes.io/projected/65e5f035-09a2-492c-8474-9b6441c345a3-kube-api-access-wp9r7\") pod \"manila-operator-controller-manager-65d89cfd9f-klmdp\" (UID: \"65e5f035-09a2-492c-8474-9b6441c345a3\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.851217 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.852702 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.855011 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.855806 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.857866 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-rg44h" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.866968 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.881117 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.882441 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.883100 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x4ll\" (UniqueName: \"kubernetes.io/projected/3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60-kube-api-access-4x4ll\") pod \"ovn-operator-controller-manager-579449c7d5-29dxq\" (UID: \"3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.883149 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtnvl\" (UniqueName: \"kubernetes.io/projected/94e75499-3011-4f29-9c5c-9a5cbea7d10f-kube-api-access-xtnvl\") pod \"swift-operator-controller-manager-6859f9b676-tq9r2\" (UID: \"94e75499-3011-4f29-9c5c-9a5cbea7d10f\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.883184 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w258d\" (UniqueName: \"kubernetes.io/projected/4dd0865e-4068-4db1-a2ae-a854d69d0367-kube-api-access-w258d\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg\" (UID: \"4dd0865e-4068-4db1-a2ae-a854d69d0367\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.883205 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t78d\" (UniqueName: \"kubernetes.io/projected/a048641a-f1d7-4abc-a26f-1537cad412ec-kube-api-access-5t78d\") pod \"telemetry-operator-controller-manager-5d4d74dd89-nhl5c\" (UID: \"a048641a-f1d7-4abc-a26f-1537cad412ec\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.883232 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4dd0865e-4068-4db1-a2ae-a854d69d0367-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg\" (UID: \"4dd0865e-4068-4db1-a2ae-a854d69d0367\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.883359 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h74q5\" (UniqueName: \"kubernetes.io/projected/1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f-kube-api-access-h74q5\") pod \"placement-operator-controller-manager-54689d9f88-mqlt4\" (UID: \"1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.885903 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-d47k6" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.886268 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.891735 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.899992 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x4ll\" (UniqueName: \"kubernetes.io/projected/3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60-kube-api-access-4x4ll\") pod \"ovn-operator-controller-manager-579449c7d5-29dxq\" (UID: \"3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60\") " pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.903145 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h74q5\" (UniqueName: \"kubernetes.io/projected/1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f-kube-api-access-h74q5\") pod \"placement-operator-controller-manager-54689d9f88-mqlt4\" (UID: \"1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.908165 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtnvl\" (UniqueName: \"kubernetes.io/projected/94e75499-3011-4f29-9c5c-9a5cbea7d10f-kube-api-access-xtnvl\") pod \"swift-operator-controller-manager-6859f9b676-tq9r2\" (UID: \"94e75499-3011-4f29-9c5c-9a5cbea7d10f\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.912110 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.913212 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.913336 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.916209 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6p4s2" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.918652 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.947115 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.970501 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.980399 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.981140 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96"] Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.982448 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.989150 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w258d\" (UniqueName: \"kubernetes.io/projected/4dd0865e-4068-4db1-a2ae-a854d69d0367-kube-api-access-w258d\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg\" (UID: \"4dd0865e-4068-4db1-a2ae-a854d69d0367\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.989190 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t78d\" (UniqueName: \"kubernetes.io/projected/a048641a-f1d7-4abc-a26f-1537cad412ec-kube-api-access-5t78d\") pod \"telemetry-operator-controller-manager-5d4d74dd89-nhl5c\" (UID: \"a048641a-f1d7-4abc-a26f-1537cad412ec\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.989220 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cszgj\" (UniqueName: \"kubernetes.io/projected/04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f-kube-api-access-cszgj\") pod \"watcher-operator-controller-manager-6cbc6dd547-pghn2\" (UID: \"04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" Oct 06 08:36:33 crc kubenswrapper[4991]: I1006 08:36:33.989246 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4dd0865e-4068-4db1-a2ae-a854d69d0367-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg\" (UID: \"4dd0865e-4068-4db1-a2ae-a854d69d0367\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:33 crc 
kubenswrapper[4991]: I1006 08:36:33.989379 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t7gl\" (UniqueName: \"kubernetes.io/projected/b503d08d-eaa0-4987-93e1-099a4ea00450-kube-api-access-2t7gl\") pod \"test-operator-controller-manager-5cd5cb47d7-q5fql\" (UID: \"b503d08d-eaa0-4987-93e1-099a4ea00450\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" Oct 06 08:36:33 crc kubenswrapper[4991]: E1006 08:36:33.989824 4991 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 08:36:33 crc kubenswrapper[4991]: E1006 08:36:33.989878 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dd0865e-4068-4db1-a2ae-a854d69d0367-cert podName:4dd0865e-4068-4db1-a2ae-a854d69d0367 nodeName:}" failed. No retries permitted until 2025-10-06 08:36:34.489853364 +0000 UTC m=+1046.227603385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4dd0865e-4068-4db1-a2ae-a854d69d0367-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" (UID: "4dd0865e-4068-4db1-a2ae-a854d69d0367") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.000830 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bsm8d" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.001010 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.002553 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.010862 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w258d\" (UniqueName: \"kubernetes.io/projected/4dd0865e-4068-4db1-a2ae-a854d69d0367-kube-api-access-w258d\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg\" (UID: \"4dd0865e-4068-4db1-a2ae-a854d69d0367\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.021661 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96"] Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.039072 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t78d\" (UniqueName: \"kubernetes.io/projected/a048641a-f1d7-4abc-a26f-1537cad412ec-kube-api-access-5t78d\") pod \"telemetry-operator-controller-manager-5d4d74dd89-nhl5c\" (UID: \"a048641a-f1d7-4abc-a26f-1537cad412ec\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.056674 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k"] Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.057974 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.064456 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-4ktq4" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.081442 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k"] Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.090560 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c317e377-2640-4117-a225-bb65849d42d0-cert\") pod \"openstack-operator-controller-manager-7c985df74c-bwr96\" (UID: \"c317e377-2640-4117-a225-bb65849d42d0\") " pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.091250 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t7gl\" (UniqueName: \"kubernetes.io/projected/b503d08d-eaa0-4987-93e1-099a4ea00450-kube-api-access-2t7gl\") pod \"test-operator-controller-manager-5cd5cb47d7-q5fql\" (UID: \"b503d08d-eaa0-4987-93e1-099a4ea00450\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.091407 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cszgj\" (UniqueName: \"kubernetes.io/projected/04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f-kube-api-access-cszgj\") pod \"watcher-operator-controller-manager-6cbc6dd547-pghn2\" (UID: \"04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.091479 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25827c7f-a146-4c7c-900f-670c747d6a15-cert\") pod \"infra-operator-controller-manager-658588b8c9-phjxk\" (UID: \"25827c7f-a146-4c7c-900f-670c747d6a15\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.091498 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbz5d\" (UniqueName: \"kubernetes.io/projected/c317e377-2640-4117-a225-bb65849d42d0-kube-api-access-gbz5d\") pod \"openstack-operator-controller-manager-7c985df74c-bwr96\" (UID: \"c317e377-2640-4117-a225-bb65849d42d0\") " pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.091580 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xgsb\" (UniqueName: \"kubernetes.io/projected/cd0a6c35-bd04-4ce8-8b61-94fe7ae169b6-kube-api-access-5xgsb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k\" (UID: \"cd0a6c35-bd04-4ce8-8b61-94fe7ae169b6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.101817 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9"] Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.125638 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cszgj\" (UniqueName: \"kubernetes.io/projected/04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f-kube-api-access-cszgj\") pod \"watcher-operator-controller-manager-6cbc6dd547-pghn2\" (UID: \"04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 
08:36:34.128213 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t7gl\" (UniqueName: \"kubernetes.io/projected/b503d08d-eaa0-4987-93e1-099a4ea00450-kube-api-access-2t7gl\") pod \"test-operator-controller-manager-5cd5cb47d7-q5fql\" (UID: \"b503d08d-eaa0-4987-93e1-099a4ea00450\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.139416 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25827c7f-a146-4c7c-900f-670c747d6a15-cert\") pod \"infra-operator-controller-manager-658588b8c9-phjxk\" (UID: \"25827c7f-a146-4c7c-900f-670c747d6a15\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.185717 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.192551 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xgsb\" (UniqueName: \"kubernetes.io/projected/cd0a6c35-bd04-4ce8-8b61-94fe7ae169b6-kube-api-access-5xgsb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k\" (UID: \"cd0a6c35-bd04-4ce8-8b61-94fe7ae169b6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.192621 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c317e377-2640-4117-a225-bb65849d42d0-cert\") pod \"openstack-operator-controller-manager-7c985df74c-bwr96\" (UID: \"c317e377-2640-4117-a225-bb65849d42d0\") " pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.192715 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbz5d\" (UniqueName: \"kubernetes.io/projected/c317e377-2640-4117-a225-bb65849d42d0-kube-api-access-gbz5d\") pod \"openstack-operator-controller-manager-7c985df74c-bwr96\" (UID: \"c317e377-2640-4117-a225-bb65849d42d0\") " pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:34 crc kubenswrapper[4991]: E1006 08:36:34.192938 4991 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 06 08:36:34 crc kubenswrapper[4991]: E1006 08:36:34.192992 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c317e377-2640-4117-a225-bb65849d42d0-cert podName:c317e377-2640-4117-a225-bb65849d42d0 nodeName:}" failed. No retries permitted until 2025-10-06 08:36:34.692975518 +0000 UTC m=+1046.430725539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c317e377-2640-4117-a225-bb65849d42d0-cert") pod "openstack-operator-controller-manager-7c985df74c-bwr96" (UID: "c317e377-2640-4117-a225-bb65849d42d0") : secret "webhook-server-cert" not found Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.217994 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbz5d\" (UniqueName: \"kubernetes.io/projected/c317e377-2640-4117-a225-bb65849d42d0-kube-api-access-gbz5d\") pod \"openstack-operator-controller-manager-7c985df74c-bwr96\" (UID: \"c317e377-2640-4117-a225-bb65849d42d0\") " pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.219428 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xgsb\" (UniqueName: \"kubernetes.io/projected/cd0a6c35-bd04-4ce8-8b61-94fe7ae169b6-kube-api-access-5xgsb\") pod 
\"rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k\" (UID: \"cd0a6c35-bd04-4ce8-8b61-94fe7ae169b6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.314379 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.329528 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9"] Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.336313 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" event={"ID":"605ba4cf-892d-451c-af8d-a6863c67898d","Type":"ContainerStarted","Data":"5f5f50948e9837ad44928751c11597a37252774aeb43a68855ef4e0ad3bb85bc"} Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.337776 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.339854 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9"] Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.360788 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.368921 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.405475 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.409713 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g"] Oct 06 08:36:34 crc kubenswrapper[4991]: W1006 08:36:34.448378 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda85c6bdb_d40d_428f_8f1e_0001b8dd34f7.slice/crio-67ec5751d9357795ec4b879cc9e8b2fbeb3197b048793385256fbacbceef7650 WatchSource:0}: Error finding container 67ec5751d9357795ec4b879cc9e8b2fbeb3197b048793385256fbacbceef7650: Status 404 returned error can't find the container with id 67ec5751d9357795ec4b879cc9e8b2fbeb3197b048793385256fbacbceef7650 Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.497109 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4dd0865e-4068-4db1-a2ae-a854d69d0367-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg\" (UID: \"4dd0865e-4068-4db1-a2ae-a854d69d0367\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:34 crc kubenswrapper[4991]: E1006 08:36:34.497275 4991 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 08:36:34 crc kubenswrapper[4991]: E1006 08:36:34.497326 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dd0865e-4068-4db1-a2ae-a854d69d0367-cert podName:4dd0865e-4068-4db1-a2ae-a854d69d0367 nodeName:}" failed. No retries permitted until 2025-10-06 08:36:35.497314534 +0000 UTC m=+1047.235064555 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4dd0865e-4068-4db1-a2ae-a854d69d0367-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" (UID: "4dd0865e-4068-4db1-a2ae-a854d69d0367") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.701615 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c317e377-2640-4117-a225-bb65849d42d0-cert\") pod \"openstack-operator-controller-manager-7c985df74c-bwr96\" (UID: \"c317e377-2640-4117-a225-bb65849d42d0\") " pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.719443 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c317e377-2640-4117-a225-bb65849d42d0-cert\") pod \"openstack-operator-controller-manager-7c985df74c-bwr96\" (UID: \"c317e377-2640-4117-a225-bb65849d42d0\") " pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.867976 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5"] Oct 06 08:36:34 crc kubenswrapper[4991]: I1006 08:36:34.993094 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.020980 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.053587 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.197481 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.210178 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.222987 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.239009 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.286337 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-699b87f775-h2655"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.346768 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" event={"ID":"a69a6896-7855-4b15-b0b9-e26f87ad2864","Type":"ContainerStarted","Data":"83f53d100d3848656f4869096e0385e942d0bdfccfbf82932f2c6a685bb2be22"} Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.348184 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" event={"ID":"9315f646-15d5-4638-8f93-2b7ee013a548","Type":"ContainerStarted","Data":"38184c3edc24eeb61cedd6d1ad6d1d502530a91dc31912976cd4bac99bc2d925"} Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.349610 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" event={"ID":"2fcb483a-426d-49ef-9126-c5c8e4ff3a17","Type":"ContainerStarted","Data":"df1fbd23fc098a2c5759e4f7406a86a46537e3524cb129a673cd688382e25ad7"} Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.350819 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" event={"ID":"a85c6bdb-d40d-428f-8f1e-0001b8dd34f7","Type":"ContainerStarted","Data":"67ec5751d9357795ec4b879cc9e8b2fbeb3197b048793385256fbacbceef7650"} Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.351858 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" event={"ID":"38df1b74-dd97-43b9-a172-93ca631f8467","Type":"ContainerStarted","Data":"f7b88483b88c07837483da45dc1e5bb622e081284815c5177814f7da359df287"} Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.362117 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" event={"ID":"ea26b29a-2a8d-4f43-8471-8f875d278b8f","Type":"ContainerStarted","Data":"7f23ee5e9633be6d320a9f8fb7669f96dd21ae86401bf2ae0f6895a9338f4fd0"} Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.373554 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" event={"ID":"db7afb25-5117-448c-aa10-aaad9f53b2d2","Type":"ContainerStarted","Data":"bf729945c310980adf620556f1764e38592a1401c24913f4a26836d761f91ba5"} Oct 06 08:36:35 crc 
kubenswrapper[4991]: I1006 08:36:35.378142 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.383174 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" event={"ID":"54e0400f-2429-4520-9d49-82915611ff23","Type":"ContainerStarted","Data":"e18fb63f75d5f5e8023c89c723173efa67b44e4ed198a4312002b325e6c5dde2"} Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.383239 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.390631 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" event={"ID":"d187cd97-0019-488f-9f51-339b4ee5c699","Type":"ContainerStarted","Data":"090264789ae76b6f096a77ef109a9218cd82916d3757a8b1dd6787a667f20240"} Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.392384 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.393790 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" event={"ID":"ead03587-67bc-428d-a356-b00483d82715","Type":"ContainerStarted","Data":"0e19e834275f9f5cbb15e1703b7acb6681fd1e59634c2002f0346af2b562e34b"} Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.397023 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.426895 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" event={"ID":"94e75499-3011-4f29-9c5c-9a5cbea7d10f","Type":"ContainerStarted","Data":"132a5cece4b5b2f74ca4f8a24d41cf2cc9d32ebdd567a57bddf2124e7dc06dd7"} Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.446158 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq"] Oct 06 08:36:35 crc kubenswrapper[4991]: W1006 08:36:35.464504 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda048641a_f1d7_4abc_a26f_1537cad412ec.slice/crio-506346d97d44864ed6ebd22dc5705108f9efaa5cf2cc40ff14b0dbfea2333b66 WatchSource:0}: Error finding container 506346d97d44864ed6ebd22dc5705108f9efaa5cf2cc40ff14b0dbfea2333b66: Status 404 returned error can't find the container with id 506346d97d44864ed6ebd22dc5705108f9efaa5cf2cc40ff14b0dbfea2333b66 Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.467069 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.509208 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql"] Oct 06 08:36:35 crc kubenswrapper[4991]: E1006 08:36:35.510840 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wp9r7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
manila-operator-controller-manager-65d89cfd9f-klmdp_openstack-operators(65e5f035-09a2-492c-8474-9b6441c345a3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:36:35 crc kubenswrapper[4991]: E1006 08:36:35.510975 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2t7gl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-q5fql_openstack-operators(b503d08d-eaa0-4987-93e1-099a4ea00450): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:36:35 crc kubenswrapper[4991]: E1006 08:36:35.511095 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h74q5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-54689d9f88-mqlt4_openstack-operators(1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.523221 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4dd0865e-4068-4db1-a2ae-a854d69d0367-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg\" (UID: \"4dd0865e-4068-4db1-a2ae-a854d69d0367\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.529859 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4dd0865e-4068-4db1-a2ae-a854d69d0367-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg\" (UID: \"4dd0865e-4068-4db1-a2ae-a854d69d0367\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.582807 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2"] Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.605467 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk"] Oct 06 08:36:35 crc kubenswrapper[4991]: W1006 08:36:35.618078 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25827c7f_a146_4c7c_900f_670c747d6a15.slice/crio-e1907bb3c5828d0d1f98a8a6508f5003589f825e9970797b0a736b474f325d37 WatchSource:0}: Error finding container e1907bb3c5828d0d1f98a8a6508f5003589f825e9970797b0a736b474f325d37: Status 404 returned error can't find the container with id e1907bb3c5828d0d1f98a8a6508f5003589f825e9970797b0a736b474f325d37 Oct 06 08:36:35 crc kubenswrapper[4991]: E1006 08:36:35.618100 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cszgj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-operator-controller-manager-6cbc6dd547-pghn2_openstack-operators(04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:36:35 crc kubenswrapper[4991]: E1006 08:36:35.633040 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kp7wn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-658588b8c9-phjxk_openstack-operators(25827c7f-a146-4c7c-900f-670c747d6a15): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.737426 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96"] Oct 06 08:36:35 crc kubenswrapper[4991]: W1006 08:36:35.746185 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc317e377_2640_4117_a225_bb65849d42d0.slice/crio-33490aa9ab935e6abb4c4b975575e3b26e4c6630e842d6c4e2f09a1a6498c3e5 WatchSource:0}: Error finding container 33490aa9ab935e6abb4c4b975575e3b26e4c6630e842d6c4e2f09a1a6498c3e5: Status 404 returned error can't find the container with id 33490aa9ab935e6abb4c4b975575e3b26e4c6630e842d6c4e2f09a1a6498c3e5 Oct 06 08:36:35 crc kubenswrapper[4991]: E1006 08:36:35.764402 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" 
podUID="b503d08d-eaa0-4987-93e1-099a4ea00450" Oct 06 08:36:35 crc kubenswrapper[4991]: E1006 08:36:35.786092 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" podUID="1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f" Oct 06 08:36:35 crc kubenswrapper[4991]: E1006 08:36:35.786148 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" podUID="65e5f035-09a2-492c-8474-9b6441c345a3" Oct 06 08:36:35 crc kubenswrapper[4991]: I1006 08:36:35.802815 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:35 crc kubenswrapper[4991]: E1006 08:36:35.887461 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" podUID="04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f" Oct 06 08:36:35 crc kubenswrapper[4991]: E1006 08:36:35.923879 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" podUID="25827c7f-a146-4c7c-900f-670c747d6a15" Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.444453 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" event={"ID":"04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f","Type":"ContainerStarted","Data":"d26361969e5fff08b75d0a2fada19666b7e37eff391fcbc6a4658d0254ba0afd"} Oct 06 08:36:36 crc 
kubenswrapper[4991]: I1006 08:36:36.444508 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" event={"ID":"04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f","Type":"ContainerStarted","Data":"b17c6e4368b2c3e4d205babb36b3a802e6d7f8e71ed0d99e5f3bc2127d4d8261"} Oct 06 08:36:36 crc kubenswrapper[4991]: E1006 08:36:36.449931 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" podUID="04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f" Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.456507 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" event={"ID":"65e5f035-09a2-492c-8474-9b6441c345a3","Type":"ContainerStarted","Data":"25626d9adb873cf8bd0835a4eaaa743cd09b59c2f30cbaa0f4bbe25cdb35355a"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.458732 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" event={"ID":"65e5f035-09a2-492c-8474-9b6441c345a3","Type":"ContainerStarted","Data":"ca5637fd278af76f592e0b2ca0922c0718ca8ed46e2fd236eb7c4988fe1a2299"} Oct 06 08:36:36 crc kubenswrapper[4991]: E1006 08:36:36.475489 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" podUID="65e5f035-09a2-492c-8474-9b6441c345a3" Oct 06 08:36:36 crc 
kubenswrapper[4991]: I1006 08:36:36.476543 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" event={"ID":"4f7769cb-2786-4ff1-8991-b8d073a47967","Type":"ContainerStarted","Data":"80b673373abe083e1cc7ae15c0187b33e849a5a488a3c2eff75b73726ad7bd30"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.497651 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" event={"ID":"c317e377-2640-4117-a225-bb65849d42d0","Type":"ContainerStarted","Data":"1134c149137ce3154f1e0f1dc798b7b4281eff5c72b2678647a5046a6bc5c3d0"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.497776 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" event={"ID":"c317e377-2640-4117-a225-bb65849d42d0","Type":"ContainerStarted","Data":"e576fa6923e601cdb19f91147ab3eb63046c0b244f541371c01fd4f619ed0c9a"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.497796 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" event={"ID":"c317e377-2640-4117-a225-bb65849d42d0","Type":"ContainerStarted","Data":"33490aa9ab935e6abb4c4b975575e3b26e4c6630e842d6c4e2f09a1a6498c3e5"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.498551 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.503781 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg"] Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.507631 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k" 
event={"ID":"cd0a6c35-bd04-4ce8-8b61-94fe7ae169b6","Type":"ContainerStarted","Data":"7f56ab8b10815c3feee13202efd39e1813466b4ed3416bc6d13ae93e6176205e"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.509811 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" event={"ID":"3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60","Type":"ContainerStarted","Data":"8f5729ba4b9a20645f7a97cb4a8e3c6511bbf170a3ec966daf0ef9f4985f7c3b"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.512948 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" event={"ID":"25827c7f-a146-4c7c-900f-670c747d6a15","Type":"ContainerStarted","Data":"3482658005fbea1090f1c3ee116f487c1bab74de87da1186e99fd87bd76bf972"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.512988 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" event={"ID":"25827c7f-a146-4c7c-900f-670c747d6a15","Type":"ContainerStarted","Data":"e1907bb3c5828d0d1f98a8a6508f5003589f825e9970797b0a736b474f325d37"} Oct 06 08:36:36 crc kubenswrapper[4991]: E1006 08:36:36.514536 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" podUID="25827c7f-a146-4c7c-900f-670c747d6a15" Oct 06 08:36:36 crc kubenswrapper[4991]: W1006 08:36:36.525845 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd0865e_4068_4db1_a2ae_a854d69d0367.slice/crio-0d27ec31bd5d6f107380ee1b926f471dc14135a09b83ea4943f31372b57272b8 WatchSource:0}: Error finding 
container 0d27ec31bd5d6f107380ee1b926f471dc14135a09b83ea4943f31372b57272b8: Status 404 returned error can't find the container with id 0d27ec31bd5d6f107380ee1b926f471dc14135a09b83ea4943f31372b57272b8 Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.539122 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" podStartSLOduration=3.5390952799999997 podStartE2EDuration="3.53909528s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:36:36.530766999 +0000 UTC m=+1048.268517020" watchObservedRunningTime="2025-10-06 08:36:36.53909528 +0000 UTC m=+1048.276845301" Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.542977 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" event={"ID":"1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f","Type":"ContainerStarted","Data":"2b8797bffa9f07822ee4485c93ffafa4ace3aa7aa9b715bda3991d4ceae4b706"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.543018 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" event={"ID":"1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f","Type":"ContainerStarted","Data":"e3fa98f6d3fcb8b8e64737b12d2286d28c0c7f14bee904314426f76c30080226"} Oct 06 08:36:36 crc kubenswrapper[4991]: E1006 08:36:36.560583 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" podUID="1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f" Oct 06 
08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.577398 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" event={"ID":"b503d08d-eaa0-4987-93e1-099a4ea00450","Type":"ContainerStarted","Data":"471efa8ec8336643f3bf2ac76154fa706c370ad1edc327978ada932c43ba0079"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.577449 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" event={"ID":"b503d08d-eaa0-4987-93e1-099a4ea00450","Type":"ContainerStarted","Data":"c65f5b3f784ce0ddc52400393d2aa7a943fa08f947caaf3fb59aae1a429876cb"} Oct 06 08:36:36 crc kubenswrapper[4991]: I1006 08:36:36.578857 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" event={"ID":"a048641a-f1d7-4abc-a26f-1537cad412ec","Type":"ContainerStarted","Data":"506346d97d44864ed6ebd22dc5705108f9efaa5cf2cc40ff14b0dbfea2333b66"} Oct 06 08:36:36 crc kubenswrapper[4991]: E1006 08:36:36.585375 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" podUID="b503d08d-eaa0-4987-93e1-099a4ea00450" Oct 06 08:36:37 crc kubenswrapper[4991]: I1006 08:36:37.591505 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" event={"ID":"4dd0865e-4068-4db1-a2ae-a854d69d0367","Type":"ContainerStarted","Data":"0d27ec31bd5d6f107380ee1b926f471dc14135a09b83ea4943f31372b57272b8"} Oct 06 08:36:37 crc kubenswrapper[4991]: E1006 08:36:37.593458 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:adc23c5fd1aece2b16dc8e22ceed628f9a719455e39d3f98c77544665c6749e1\\\"\"" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" podUID="1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f" Oct 06 08:36:37 crc kubenswrapper[4991]: E1006 08:36:37.593568 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" podUID="b503d08d-eaa0-4987-93e1-099a4ea00450" Oct 06 08:36:37 crc kubenswrapper[4991]: E1006 08:36:37.594358 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" podUID="25827c7f-a146-4c7c-900f-670c747d6a15" Oct 06 08:36:37 crc kubenswrapper[4991]: E1006 08:36:37.596702 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:063aae1458289d1090a77c74c2b978b9eb978b0e4062c399f0cb5434a8dd2757\\\"\"" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" podUID="65e5f035-09a2-492c-8474-9b6441c345a3" Oct 06 08:36:37 crc kubenswrapper[4991]: E1006 08:36:37.597140 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" podUID="04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f" Oct 06 08:36:44 crc kubenswrapper[4991]: I1006 08:36:44.998694 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7c985df74c-bwr96" Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.710030 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" event={"ID":"9315f646-15d5-4638-8f93-2b7ee013a548","Type":"ContainerStarted","Data":"ec2fa7963010cc4ef0316105498e37ba5439f291624e17ee4ac0c6ddce2e6655"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.710280 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" event={"ID":"9315f646-15d5-4638-8f93-2b7ee013a548","Type":"ContainerStarted","Data":"0e8e79af60f14673c5daa643fc7b2e5f1f2e8b0411bda1d33ab96056b2fad5a8"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.711154 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.722365 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" event={"ID":"54e0400f-2429-4520-9d49-82915611ff23","Type":"ContainerStarted","Data":"6e1ed4b4600835023864d0ac4d78d77ba14f5cda6ac1f8138de3c08b06d15227"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.737724 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" 
event={"ID":"605ba4cf-892d-451c-af8d-a6863c67898d","Type":"ContainerStarted","Data":"36ce1ed3022e64dd9af028a50b19b0c0516e1bb955ba8912b3aef4a5221706aa"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.739477 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" event={"ID":"ead03587-67bc-428d-a356-b00483d82715","Type":"ContainerStarted","Data":"5e87fbc583033d58b3fb026d4268d5ee8cae776444cb8fa1c20af6ddb8fd4638"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.751083 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" podStartSLOduration=3.406995027 podStartE2EDuration="14.751068999s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.078146683 +0000 UTC m=+1046.815896704" lastFinishedPulling="2025-10-06 08:36:46.422220655 +0000 UTC m=+1058.159970676" observedRunningTime="2025-10-06 08:36:47.749396992 +0000 UTC m=+1059.487147013" watchObservedRunningTime="2025-10-06 08:36:47.751068999 +0000 UTC m=+1059.488819020" Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.781928 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" event={"ID":"94e75499-3011-4f29-9c5c-9a5cbea7d10f","Type":"ContainerStarted","Data":"ea2593cdc77dc33c11a952f9e4bfd59abc140df066dd4b84cdb7623a314c7861"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.783187 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" event={"ID":"ea26b29a-2a8d-4f43-8471-8f875d278b8f","Type":"ContainerStarted","Data":"6bf4c163b188b114d1d1cdb1b5598d02e815d12ffd86a3327e96d3180ca1621e"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.783936 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.785021 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" event={"ID":"d187cd97-0019-488f-9f51-339b4ee5c699","Type":"ContainerStarted","Data":"84cb2e9200bc4ba3fa300fa06b1eda512d5f59a6773d9f6ee060515591d40b5d"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.785044 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" event={"ID":"d187cd97-0019-488f-9f51-339b4ee5c699","Type":"ContainerStarted","Data":"ced046b41361256d385b4c8659c6042e381b871b0dab12eac52d485ea2b67795"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.785395 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.786243 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" event={"ID":"4dd0865e-4068-4db1-a2ae-a854d69d0367","Type":"ContainerStarted","Data":"dca958e1547beadf717a830ba94510db84eb275e360ef60657a511b6aa48a7f5"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.791955 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" event={"ID":"a048641a-f1d7-4abc-a26f-1537cad412ec","Type":"ContainerStarted","Data":"19d108b30b984130ad92317f23ab198cfa8b26c036aa9096d95c32ec6ea447e0"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.795205 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k" 
event={"ID":"cd0a6c35-bd04-4ce8-8b61-94fe7ae169b6","Type":"ContainerStarted","Data":"5868db0e68c0d25e0ceb15fa521bc7624d0bf453630336a9e360b577cefea54f"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.798945 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" event={"ID":"a85c6bdb-d40d-428f-8f1e-0001b8dd34f7","Type":"ContainerStarted","Data":"711fd0ca232c94dfc49bd8123caee17ea60bcd6123baa219c0ec4bdf0a4d7c67"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.802450 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" event={"ID":"a69a6896-7855-4b15-b0b9-e26f87ad2864","Type":"ContainerStarted","Data":"c375b6674feb0611ed0279f024369ec822ae34026e75a35db42ab9b15dee98e5"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.805558 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" podStartSLOduration=2.93169293 podStartE2EDuration="14.805545814s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:34.494220215 +0000 UTC m=+1046.231970236" lastFinishedPulling="2025-10-06 08:36:46.368073099 +0000 UTC m=+1058.105823120" observedRunningTime="2025-10-06 08:36:47.803729424 +0000 UTC m=+1059.541479445" watchObservedRunningTime="2025-10-06 08:36:47.805545814 +0000 UTC m=+1059.543295835" Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.833767 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" event={"ID":"4f7769cb-2786-4ff1-8991-b8d073a47967","Type":"ContainerStarted","Data":"a02bc4d988f4523647bfad23a349b6a8b6664625273914903a907b7fe8386504"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.834708 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" podStartSLOduration=3.601426809 podStartE2EDuration="14.834691214s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.124745848 +0000 UTC m=+1046.862495869" lastFinishedPulling="2025-10-06 08:36:46.358010243 +0000 UTC m=+1058.095760274" observedRunningTime="2025-10-06 08:36:47.8338194 +0000 UTC m=+1059.571569421" watchObservedRunningTime="2025-10-06 08:36:47.834691214 +0000 UTC m=+1059.572441235" Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.842447 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" event={"ID":"db7afb25-5117-448c-aa10-aaad9f53b2d2","Type":"ContainerStarted","Data":"04f5272c60489f8cf669a989d75347aee64e165fb464ae635d838f07d1c4c438"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.853180 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" event={"ID":"2fcb483a-426d-49ef-9126-c5c8e4ff3a17","Type":"ContainerStarted","Data":"7479d50a33d5c858771bbe988d57a18e135690f85df17e773c560e9cb245aa21"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.864565 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" event={"ID":"38df1b74-dd97-43b9-a172-93ca631f8467","Type":"ContainerStarted","Data":"452d4fbe938c489a079edc45007a0ce9132e1eb7079ff58b2b4a56e6466238e0"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.865037 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.869216 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k" 
podStartSLOduration=2.982093682 podStartE2EDuration="13.869199742s" podCreationTimestamp="2025-10-06 08:36:34 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.464466536 +0000 UTC m=+1047.202216557" lastFinishedPulling="2025-10-06 08:36:46.351572596 +0000 UTC m=+1058.089322617" observedRunningTime="2025-10-06 08:36:47.864467831 +0000 UTC m=+1059.602217852" watchObservedRunningTime="2025-10-06 08:36:47.869199742 +0000 UTC m=+1059.606949763" Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.877356 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" event={"ID":"3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60","Type":"ContainerStarted","Data":"358d6f178ad3290225c17bacf7e721c1674048b92e4a851483321ae785a0d28c"} Oct 06 08:36:47 crc kubenswrapper[4991]: I1006 08:36:47.895781 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" podStartSLOduration=3.859471972 podStartE2EDuration="14.89576402s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.314886827 +0000 UTC m=+1047.052636848" lastFinishedPulling="2025-10-06 08:36:46.351178875 +0000 UTC m=+1058.088928896" observedRunningTime="2025-10-06 08:36:47.885266502 +0000 UTC m=+1059.623016523" watchObservedRunningTime="2025-10-06 08:36:47.89576402 +0000 UTC m=+1059.633514041" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.885054 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" event={"ID":"a048641a-f1d7-4abc-a26f-1537cad412ec","Type":"ContainerStarted","Data":"923ee20bc6f9a2c340d686faf7a729381cfbc72e3562404f6b24f26cc2f114b2"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.885480 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.887798 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" event={"ID":"ead03587-67bc-428d-a356-b00483d82715","Type":"ContainerStarted","Data":"54df6f5e59236c5765ffd83d1196c77a997a372b1369475181cded1a3ffeca61"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.887912 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.889697 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" event={"ID":"54e0400f-2429-4520-9d49-82915611ff23","Type":"ContainerStarted","Data":"5e4060a9f2f02ae671292420aefbceca4f530216874585e4ed27f55dbdf5e0cf"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.889846 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.891474 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" event={"ID":"605ba4cf-892d-451c-af8d-a6863c67898d","Type":"ContainerStarted","Data":"56f11bdb761f275c82d27f9dffd5cf0249c466d724cd2cf211dc6d96e857c6ca"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.891634 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.893084 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" 
event={"ID":"2fcb483a-426d-49ef-9126-c5c8e4ff3a17","Type":"ContainerStarted","Data":"6662bd197aa1457774c676382d9365ec7bdc158eabd4994224ef17f243507ba5"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.893152 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.894724 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" event={"ID":"38df1b74-dd97-43b9-a172-93ca631f8467","Type":"ContainerStarted","Data":"02a388d85a0c215a205ecb7e2443fe80ac789a7803ccb7dd64c11d176f720676"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.896214 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" event={"ID":"4dd0865e-4068-4db1-a2ae-a854d69d0367","Type":"ContainerStarted","Data":"59458d57dc45e366b719f4de48093e3d6622b44a9c0274163625938ff228137e"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.896931 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.898371 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" event={"ID":"4f7769cb-2786-4ff1-8991-b8d073a47967","Type":"ContainerStarted","Data":"843136ee3e7923c0a8c29e254311e029c6a530a16ad54e713a2329a1758be922"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.898797 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.900678 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" event={"ID":"a85c6bdb-d40d-428f-8f1e-0001b8dd34f7","Type":"ContainerStarted","Data":"c16363ab387d82770a4113234a4df8aec3ee8f84fbe379e4c90623e187acdcc3"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.906130 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" event={"ID":"3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60","Type":"ContainerStarted","Data":"a39276d0d8dd975298b514144e1c9f5edd6c0b613aa60971ed0a1a4a425410ed"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.907004 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.916371 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" event={"ID":"ea26b29a-2a8d-4f43-8471-8f875d278b8f","Type":"ContainerStarted","Data":"d06096139e5f16cfb09133baeb0330d892f84a18e0f108f3a7258257e8002bbc"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.922729 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" event={"ID":"a69a6896-7855-4b15-b0b9-e26f87ad2864","Type":"ContainerStarted","Data":"32731d3d26fe58388d61397fe2f92c39ac19b8314bef56a888eb8870e07c276f"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.923015 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.924955 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" 
event={"ID":"94e75499-3011-4f29-9c5c-9a5cbea7d10f","Type":"ContainerStarted","Data":"f0ab7e0b3dfa61d21690460760801c34c77f5f35352b76c56262a3889f77408a"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.925582 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.927843 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" event={"ID":"db7afb25-5117-448c-aa10-aaad9f53b2d2","Type":"ContainerStarted","Data":"b9dcae6b13e0366fe91eac355c5bfef296a734fc426b2a8e7bc3b3bd1a346dda"} Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.928376 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.951063 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" podStartSLOduration=6.000955376 podStartE2EDuration="15.951048233s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:36.544612569 +0000 UTC m=+1048.282362590" lastFinishedPulling="2025-10-06 08:36:46.494705416 +0000 UTC m=+1058.232455447" observedRunningTime="2025-10-06 08:36:48.948533514 +0000 UTC m=+1060.686283535" watchObservedRunningTime="2025-10-06 08:36:48.951048233 +0000 UTC m=+1060.688798254" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.954619 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" podStartSLOduration=5.061546945 podStartE2EDuration="15.95460775s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.498231911 +0000 UTC 
m=+1047.235981932" lastFinishedPulling="2025-10-06 08:36:46.391292716 +0000 UTC m=+1058.129042737" observedRunningTime="2025-10-06 08:36:48.914855929 +0000 UTC m=+1060.652605950" watchObservedRunningTime="2025-10-06 08:36:48.95460775 +0000 UTC m=+1060.692357761" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.989608 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" podStartSLOduration=4.839807771 podStartE2EDuration="15.989591671s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.21831385 +0000 UTC m=+1046.956063871" lastFinishedPulling="2025-10-06 08:36:46.36809774 +0000 UTC m=+1058.105847771" observedRunningTime="2025-10-06 08:36:48.988185602 +0000 UTC m=+1060.725935623" watchObservedRunningTime="2025-10-06 08:36:48.989591671 +0000 UTC m=+1060.727341692" Oct 06 08:36:48 crc kubenswrapper[4991]: I1006 08:36:48.992367 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" podStartSLOduration=5.086370283 podStartE2EDuration="15.992359597s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.446795745 +0000 UTC m=+1047.184545766" lastFinishedPulling="2025-10-06 08:36:46.352785059 +0000 UTC m=+1058.090535080" observedRunningTime="2025-10-06 08:36:48.969838059 +0000 UTC m=+1060.707588100" watchObservedRunningTime="2025-10-06 08:36:48.992359597 +0000 UTC m=+1060.730109618" Oct 06 08:36:49 crc kubenswrapper[4991]: I1006 08:36:49.022141 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" podStartSLOduration=4.838949475 podStartE2EDuration="16.022121244s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.245198265 +0000 UTC m=+1046.982948286" 
lastFinishedPulling="2025-10-06 08:36:46.428370024 +0000 UTC m=+1058.166120055" observedRunningTime="2025-10-06 08:36:49.02016232 +0000 UTC m=+1060.757912341" watchObservedRunningTime="2025-10-06 08:36:49.022121244 +0000 UTC m=+1060.759871265" Oct 06 08:36:49 crc kubenswrapper[4991]: I1006 08:36:49.040318 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" podStartSLOduration=4.111976824 podStartE2EDuration="16.040274613s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:34.493962217 +0000 UTC m=+1046.231712238" lastFinishedPulling="2025-10-06 08:36:46.422259976 +0000 UTC m=+1058.160010027" observedRunningTime="2025-10-06 08:36:49.034984557 +0000 UTC m=+1060.772734598" watchObservedRunningTime="2025-10-06 08:36:49.040274613 +0000 UTC m=+1060.778024634" Oct 06 08:36:49 crc kubenswrapper[4991]: I1006 08:36:49.059477 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" podStartSLOduration=5.120712716 podStartE2EDuration="16.059451889s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.453097208 +0000 UTC m=+1047.190847229" lastFinishedPulling="2025-10-06 08:36:46.391836391 +0000 UTC m=+1058.129586402" observedRunningTime="2025-10-06 08:36:49.058559994 +0000 UTC m=+1060.796310015" watchObservedRunningTime="2025-10-06 08:36:49.059451889 +0000 UTC m=+1060.797201900" Oct 06 08:36:49 crc kubenswrapper[4991]: I1006 08:36:49.100028 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" podStartSLOduration=4.603274048 podStartE2EDuration="16.100005242s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:34.894141051 +0000 UTC m=+1046.631891072" 
lastFinishedPulling="2025-10-06 08:36:46.390872245 +0000 UTC m=+1058.128622266" observedRunningTime="2025-10-06 08:36:49.09519155 +0000 UTC m=+1060.832941571" watchObservedRunningTime="2025-10-06 08:36:49.100005242 +0000 UTC m=+1060.837755263" Oct 06 08:36:49 crc kubenswrapper[4991]: I1006 08:36:49.100852 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" podStartSLOduration=3.858662682 podStartE2EDuration="16.100844685s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:34.107960743 +0000 UTC m=+1045.845710764" lastFinishedPulling="2025-10-06 08:36:46.350142746 +0000 UTC m=+1058.087892767" observedRunningTime="2025-10-06 08:36:49.083061477 +0000 UTC m=+1060.820811508" watchObservedRunningTime="2025-10-06 08:36:49.100844685 +0000 UTC m=+1060.838594706" Oct 06 08:36:49 crc kubenswrapper[4991]: I1006 08:36:49.157659 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" podStartSLOduration=4.962239459 podStartE2EDuration="16.157636965s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.230963054 +0000 UTC m=+1046.968713075" lastFinishedPulling="2025-10-06 08:36:46.42636055 +0000 UTC m=+1058.164110581" observedRunningTime="2025-10-06 08:36:49.12543195 +0000 UTC m=+1060.863182001" watchObservedRunningTime="2025-10-06 08:36:49.157636965 +0000 UTC m=+1060.895386986" Oct 06 08:36:49 crc kubenswrapper[4991]: I1006 08:36:49.160016 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" podStartSLOduration=4.215551651 podStartE2EDuration="16.16000206s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:34.447029372 +0000 UTC m=+1046.184779393" 
lastFinishedPulling="2025-10-06 08:36:46.391479771 +0000 UTC m=+1058.129229802" observedRunningTime="2025-10-06 08:36:49.151605959 +0000 UTC m=+1060.889355990" watchObservedRunningTime="2025-10-06 08:36:49.16000206 +0000 UTC m=+1060.897752081" Oct 06 08:36:49 crc kubenswrapper[4991]: I1006 08:36:49.172468 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" podStartSLOduration=5.005601697 podStartE2EDuration="16.172445891s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.249600853 +0000 UTC m=+1046.987350874" lastFinishedPulling="2025-10-06 08:36:46.416445047 +0000 UTC m=+1058.154195068" observedRunningTime="2025-10-06 08:36:49.168373819 +0000 UTC m=+1060.906123850" watchObservedRunningTime="2025-10-06 08:36:49.172445891 +0000 UTC m=+1060.910195922" Oct 06 08:36:49 crc kubenswrapper[4991]: I1006 08:36:49.937696 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" Oct 06 08:36:52 crc kubenswrapper[4991]: I1006 08:36:52.960045 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" event={"ID":"04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f","Type":"ContainerStarted","Data":"7181e55b5149255a4c9bbda9769c37ae75c60aaa422a22ab87dda5a75caabff5"} Oct 06 08:36:52 crc kubenswrapper[4991]: I1006 08:36:52.960779 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" Oct 06 08:36:52 crc kubenswrapper[4991]: I1006 08:36:52.989152 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" podStartSLOduration=3.635158133 podStartE2EDuration="19.989133315s" podCreationTimestamp="2025-10-06 08:36:33 
+0000 UTC" firstStartedPulling="2025-10-06 08:36:35.617984567 +0000 UTC m=+1047.355734588" lastFinishedPulling="2025-10-06 08:36:51.971959719 +0000 UTC m=+1063.709709770" observedRunningTime="2025-10-06 08:36:52.984771836 +0000 UTC m=+1064.722521857" watchObservedRunningTime="2025-10-06 08:36:52.989133315 +0000 UTC m=+1064.726883326" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.576382 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5f7c849b98-jlsb9" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.583322 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-wrsh9" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.619198 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-nsr9g" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.626742 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5568b5d68-25vl9" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.714110 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-8f58bc9db-94qvn" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.752051 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54876c876f-dnp2m" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.785371 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-699b87f775-h2655" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.861242 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-655d88ccb9-v7mb5" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.874867 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-wzcnb" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.894083 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.918422 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-tq9r2" Oct 06 08:36:53 crc kubenswrapper[4991]: I1006 08:36:53.984632 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-qrvjh" Oct 06 08:36:54 crc kubenswrapper[4991]: I1006 08:36:54.009520 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vf4gj" Oct 06 08:36:54 crc kubenswrapper[4991]: I1006 08:36:54.202071 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-579449c7d5-29dxq" Oct 06 08:36:54 crc kubenswrapper[4991]: I1006 08:36:54.317535 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-nhl5c" Oct 06 08:36:54 crc kubenswrapper[4991]: I1006 08:36:54.974156 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" event={"ID":"65e5f035-09a2-492c-8474-9b6441c345a3","Type":"ContainerStarted","Data":"ae926fdd761d9c68bcace330d5cd9280c080f544ddc9c0b276190830242b7a51"} Oct 06 08:36:54 crc kubenswrapper[4991]: I1006 08:36:54.974707 4991 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" Oct 06 08:36:54 crc kubenswrapper[4991]: I1006 08:36:54.976392 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" event={"ID":"1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f","Type":"ContainerStarted","Data":"571bb08ebfe0aaeb9cdb1a7e185e94452228633fae07d62714350fbf28c29b22"} Oct 06 08:36:54 crc kubenswrapper[4991]: I1006 08:36:54.976566 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" Oct 06 08:36:55 crc kubenswrapper[4991]: I1006 08:36:55.012581 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" podStartSLOduration=3.380720897 podStartE2EDuration="22.012557857s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.51102846 +0000 UTC m=+1047.248778481" lastFinishedPulling="2025-10-06 08:36:54.14286542 +0000 UTC m=+1065.880615441" observedRunningTime="2025-10-06 08:36:55.007916399 +0000 UTC m=+1066.745666440" watchObservedRunningTime="2025-10-06 08:36:55.012557857 +0000 UTC m=+1066.750307878" Oct 06 08:36:55 crc kubenswrapper[4991]: I1006 08:36:55.013144 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" podStartSLOduration=3.37941487 podStartE2EDuration="22.013137483s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.5106989 +0000 UTC m=+1047.248448921" lastFinishedPulling="2025-10-06 08:36:54.144421513 +0000 UTC m=+1065.882171534" observedRunningTime="2025-10-06 08:36:54.988780314 +0000 UTC m=+1066.726530355" watchObservedRunningTime="2025-10-06 08:36:55.013137483 
+0000 UTC m=+1066.750887504" Oct 06 08:36:55 crc kubenswrapper[4991]: I1006 08:36:55.810835 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg" Oct 06 08:36:55 crc kubenswrapper[4991]: I1006 08:36:55.984006 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" event={"ID":"b503d08d-eaa0-4987-93e1-099a4ea00450","Type":"ContainerStarted","Data":"5924403432eb29c76fdffba22c2aec7edaa2ac1c24a9fce416148c248fdcef92"} Oct 06 08:36:55 crc kubenswrapper[4991]: I1006 08:36:55.984329 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" Oct 06 08:36:55 crc kubenswrapper[4991]: I1006 08:36:55.986168 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" event={"ID":"25827c7f-a146-4c7c-900f-670c747d6a15","Type":"ContainerStarted","Data":"3720b2ddf654724b13ad5d9307e72a9012702e9e241df50fec2714576a87ffcc"} Oct 06 08:36:55 crc kubenswrapper[4991]: I1006 08:36:55.986495 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:36:56 crc kubenswrapper[4991]: I1006 08:36:56.019843 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" podStartSLOduration=3.174338471 podStartE2EDuration="23.019822041s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.510908396 +0000 UTC m=+1047.248658417" lastFinishedPulling="2025-10-06 08:36:55.356391966 +0000 UTC m=+1067.094141987" observedRunningTime="2025-10-06 08:36:56.003063851 +0000 UTC m=+1067.740813872" watchObservedRunningTime="2025-10-06 08:36:56.019822041 
+0000 UTC m=+1067.757572062" Oct 06 08:36:56 crc kubenswrapper[4991]: I1006 08:36:56.021010 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" podStartSLOduration=3.302909565 podStartE2EDuration="23.021003244s" podCreationTimestamp="2025-10-06 08:36:33 +0000 UTC" firstStartedPulling="2025-10-06 08:36:35.632864568 +0000 UTC m=+1047.370614599" lastFinishedPulling="2025-10-06 08:36:55.350958257 +0000 UTC m=+1067.088708278" observedRunningTime="2025-10-06 08:36:56.01868659 +0000 UTC m=+1067.756436641" watchObservedRunningTime="2025-10-06 08:36:56.021003244 +0000 UTC m=+1067.758753265" Oct 06 08:36:57 crc kubenswrapper[4991]: I1006 08:36:57.528830 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:36:57 crc kubenswrapper[4991]: I1006 08:36:57.528926 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:37:03 crc kubenswrapper[4991]: I1006 08:37:03.951594 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-mqlt4" Oct 06 08:37:03 crc kubenswrapper[4991]: I1006 08:37:03.976882 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-klmdp" Oct 06 08:37:04 crc kubenswrapper[4991]: I1006 08:37:04.341748 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-q5fql" Oct 06 08:37:04 crc kubenswrapper[4991]: I1006 08:37:04.367265 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-pghn2" Oct 06 08:37:04 crc kubenswrapper[4991]: I1006 08:37:04.376803 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-phjxk" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.297015 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bqkbr"] Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.299423 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.302393 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.302704 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.302765 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.302912 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-29vn2" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.311928 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bqkbr"] Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.374970 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mx7ks"] Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.401937 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-mx7ks"] Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.402035 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.406270 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.466063 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6d56\" (UniqueName: \"kubernetes.io/projected/757b126b-cbc0-4390-a7e4-8223dee3aadb-kube-api-access-g6d56\") pod \"dnsmasq-dns-675f4bcbfc-bqkbr\" (UID: \"757b126b-cbc0-4390-a7e4-8223dee3aadb\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.466251 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757b126b-cbc0-4390-a7e4-8223dee3aadb-config\") pod \"dnsmasq-dns-675f4bcbfc-bqkbr\" (UID: \"757b126b-cbc0-4390-a7e4-8223dee3aadb\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.567438 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6d56\" (UniqueName: \"kubernetes.io/projected/757b126b-cbc0-4390-a7e4-8223dee3aadb-kube-api-access-g6d56\") pod \"dnsmasq-dns-675f4bcbfc-bqkbr\" (UID: \"757b126b-cbc0-4390-a7e4-8223dee3aadb\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.567578 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mx7ks\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 
06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.567596 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hqsm\" (UniqueName: \"kubernetes.io/projected/ccc129da-febf-4041-b4cf-eabd00b0e163-kube-api-access-5hqsm\") pod \"dnsmasq-dns-78dd6ddcc-mx7ks\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.567637 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-config\") pod \"dnsmasq-dns-78dd6ddcc-mx7ks\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.567682 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757b126b-cbc0-4390-a7e4-8223dee3aadb-config\") pod \"dnsmasq-dns-675f4bcbfc-bqkbr\" (UID: \"757b126b-cbc0-4390-a7e4-8223dee3aadb\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.568770 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757b126b-cbc0-4390-a7e4-8223dee3aadb-config\") pod \"dnsmasq-dns-675f4bcbfc-bqkbr\" (UID: \"757b126b-cbc0-4390-a7e4-8223dee3aadb\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.586045 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6d56\" (UniqueName: \"kubernetes.io/projected/757b126b-cbc0-4390-a7e4-8223dee3aadb-kube-api-access-g6d56\") pod \"dnsmasq-dns-675f4bcbfc-bqkbr\" (UID: \"757b126b-cbc0-4390-a7e4-8223dee3aadb\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 
08:37:18.630522 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.668480 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mx7ks\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.668513 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hqsm\" (UniqueName: \"kubernetes.io/projected/ccc129da-febf-4041-b4cf-eabd00b0e163-kube-api-access-5hqsm\") pod \"dnsmasq-dns-78dd6ddcc-mx7ks\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.668538 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-config\") pod \"dnsmasq-dns-78dd6ddcc-mx7ks\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.669921 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-config\") pod \"dnsmasq-dns-78dd6ddcc-mx7ks\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.670650 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mx7ks\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.686923 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hqsm\" (UniqueName: \"kubernetes.io/projected/ccc129da-febf-4041-b4cf-eabd00b0e163-kube-api-access-5hqsm\") pod \"dnsmasq-dns-78dd6ddcc-mx7ks\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:18 crc kubenswrapper[4991]: I1006 08:37:18.740260 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:19 crc kubenswrapper[4991]: I1006 08:37:19.082348 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bqkbr"] Oct 06 08:37:19 crc kubenswrapper[4991]: I1006 08:37:19.088684 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:37:19 crc kubenswrapper[4991]: I1006 08:37:19.180867 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" event={"ID":"757b126b-cbc0-4390-a7e4-8223dee3aadb","Type":"ContainerStarted","Data":"844ae0f8045335a65d3ef4b423163b53e9548c925fd971d45ddf7c5533d3bd18"} Oct 06 08:37:19 crc kubenswrapper[4991]: I1006 08:37:19.214244 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mx7ks"] Oct 06 08:37:19 crc kubenswrapper[4991]: W1006 08:37:19.219397 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccc129da_febf_4041_b4cf_eabd00b0e163.slice/crio-715cf7089c1ccc92b2a46deb7d72455427b2cd2fa1b47f45508dc1961ace5f1a WatchSource:0}: Error finding container 715cf7089c1ccc92b2a46deb7d72455427b2cd2fa1b47f45508dc1961ace5f1a: Status 404 returned error can't find the container with id 715cf7089c1ccc92b2a46deb7d72455427b2cd2fa1b47f45508dc1961ace5f1a Oct 06 08:37:20 crc 
kubenswrapper[4991]: I1006 08:37:20.191514 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" event={"ID":"ccc129da-febf-4041-b4cf-eabd00b0e163","Type":"ContainerStarted","Data":"715cf7089c1ccc92b2a46deb7d72455427b2cd2fa1b47f45508dc1961ace5f1a"} Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.233229 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bqkbr"] Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.276602 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv52b"] Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.281005 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.289194 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv52b"] Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.303928 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh92d\" (UniqueName: \"kubernetes.io/projected/1f4ba1fc-cafd-47e7-812a-6041044f864b-kube-api-access-bh92d\") pod \"dnsmasq-dns-666b6646f7-lv52b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.304094 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-config\") pod \"dnsmasq-dns-666b6646f7-lv52b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.304123 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lv52b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.405282 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-config\") pod \"dnsmasq-dns-666b6646f7-lv52b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.405341 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lv52b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.405372 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh92d\" (UniqueName: \"kubernetes.io/projected/1f4ba1fc-cafd-47e7-812a-6041044f864b-kube-api-access-bh92d\") pod \"dnsmasq-dns-666b6646f7-lv52b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.406593 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-dns-svc\") pod \"dnsmasq-dns-666b6646f7-lv52b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.406938 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-config\") pod \"dnsmasq-dns-666b6646f7-lv52b\" (UID: 
\"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.453528 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh92d\" (UniqueName: \"kubernetes.io/projected/1f4ba1fc-cafd-47e7-812a-6041044f864b-kube-api-access-bh92d\") pod \"dnsmasq-dns-666b6646f7-lv52b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.542234 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mx7ks"] Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.579976 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7gwm"] Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.587056 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.590228 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7gwm"] Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.617098 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.618010 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-v7gwm\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.618070 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4ctb\" (UniqueName: \"kubernetes.io/projected/62f2b65d-d3ac-49ae-b398-f379a6bda788-kube-api-access-h4ctb\") pod \"dnsmasq-dns-57d769cc4f-v7gwm\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.618123 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-config\") pod \"dnsmasq-dns-57d769cc4f-v7gwm\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.718958 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-v7gwm\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.719037 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4ctb\" (UniqueName: \"kubernetes.io/projected/62f2b65d-d3ac-49ae-b398-f379a6bda788-kube-api-access-h4ctb\") pod \"dnsmasq-dns-57d769cc4f-v7gwm\" (UID: 
\"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.719080 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-config\") pod \"dnsmasq-dns-57d769cc4f-v7gwm\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.720753 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-config\") pod \"dnsmasq-dns-57d769cc4f-v7gwm\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.720998 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-v7gwm\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.737932 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4ctb\" (UniqueName: \"kubernetes.io/projected/62f2b65d-d3ac-49ae-b398-f379a6bda788-kube-api-access-h4ctb\") pod \"dnsmasq-dns-57d769cc4f-v7gwm\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:21 crc kubenswrapper[4991]: I1006 08:37:21.914150 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.155602 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv52b"] Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.182732 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7gwm"] Oct 06 08:37:22 crc kubenswrapper[4991]: W1006 08:37:22.190843 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62f2b65d_d3ac_49ae_b398_f379a6bda788.slice/crio-9fd6a7fb85cf86beeb8c227ece104795edc80d4a9512f7c186a0631e5d929fa5 WatchSource:0}: Error finding container 9fd6a7fb85cf86beeb8c227ece104795edc80d4a9512f7c186a0631e5d929fa5: Status 404 returned error can't find the container with id 9fd6a7fb85cf86beeb8c227ece104795edc80d4a9512f7c186a0631e5d929fa5 Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.210674 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" event={"ID":"1f4ba1fc-cafd-47e7-812a-6041044f864b","Type":"ContainerStarted","Data":"ee49a38d1cbdec6d046749facabe7ed9dafde8ec72c597e06f1eebfdae116dad"} Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.212818 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" event={"ID":"62f2b65d-d3ac-49ae-b398-f379a6bda788","Type":"ContainerStarted","Data":"9fd6a7fb85cf86beeb8c227ece104795edc80d4a9512f7c186a0631e5d929fa5"} Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.411175 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.412783 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.414699 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.414895 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.418388 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.418938 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.419103 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.419151 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.420525 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7ntvc" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.425232 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.539665 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.539728 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.539748 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.539903 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.539945 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.539971 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.539986 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-plugins\") pod \"rabbitmq-server-0\" 
(UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.540003 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.540130 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.540419 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.540456 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjlld\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-kube-api-access-kjlld\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.642073 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " 
pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.642129 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.642175 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.642210 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.642240 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.642313 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.642337 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.642362 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.642455 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.643151 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjlld\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-kube-api-access-kjlld\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.643187 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.643201 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.643218 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.643203 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.643280 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.643426 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.643697 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.647826 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.654055 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.659387 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.663693 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.668205 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjlld\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-kube-api-access-kjlld\") pod \"rabbitmq-server-0\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.676485 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: 
\"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.677595 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.682312 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.687010 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5xntt" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.687311 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.687351 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.687488 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.687507 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.687694 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.687752 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.687696 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.746634 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.846136 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.846204 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.846259 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.846319 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7987\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-kube-api-access-w7987\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.846352 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.846384 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.846409 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.846857 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.846975 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.847016 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.847162 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951227 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951287 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951350 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951366 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" 
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951414 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7987\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-kube-api-access-w7987\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951439 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951461 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951477 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951501 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951519 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.951537 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.952402 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.952566 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.952665 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.952915 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.953131 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.954921 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.956241 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.956714 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.957530 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.957745 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.975042 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7987\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-kube-api-access-w7987\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:22 crc kubenswrapper[4991]: I1006 08:37:22.993926 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:23 crc kubenswrapper[4991]: I1006 08:37:23.039608 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 08:37:23 crc kubenswrapper[4991]: I1006 08:37:23.229668 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 08:37:23 crc kubenswrapper[4991]: W1006 08:37:23.253436 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53c6aca4_4fd0_4d42_bbe2_4b6e91643503.slice/crio-396fb26a587e0452b10628b69988716675f31756c67979296f98a06fd8e6573a WatchSource:0}: Error finding container 396fb26a587e0452b10628b69988716675f31756c67979296f98a06fd8e6573a: Status 404 returned error can't find the container with id 396fb26a587e0452b10628b69988716675f31756c67979296f98a06fd8e6573a
Oct 06 08:37:23 crc kubenswrapper[4991]: I1006 08:37:23.578508 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 08:37:23 crc kubenswrapper[4991]: W1006 08:37:23.591112 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e8ba650_c3ef_45bd_ac9b_daaa4889c2f1.slice/crio-4c1ea580b6f933663c7bb9be9554723e3af455aced231068c8eb2051af6ac836 WatchSource:0}: Error finding container 4c1ea580b6f933663c7bb9be9554723e3af455aced231068c8eb2051af6ac836: Status 404 returned error can't find the container with id 4c1ea580b6f933663c7bb9be9554723e3af455aced231068c8eb2051af6ac836
Oct 06 08:37:24 crc kubenswrapper[4991]: I1006 08:37:24.236065 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1","Type":"ContainerStarted","Data":"4c1ea580b6f933663c7bb9be9554723e3af455aced231068c8eb2051af6ac836"}
Oct 06 08:37:24 crc kubenswrapper[4991]: I1006 08:37:24.237980 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53c6aca4-4fd0-4d42-bbe2-4b6e91643503","Type":"ContainerStarted","Data":"396fb26a587e0452b10628b69988716675f31756c67979296f98a06fd8e6573a"}
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.432613 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.437931 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.441212 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.441687 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.442788 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ncj8k"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.444938 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.445814 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.447083 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.447397 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.567124 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.568567 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.576639 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.576904 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nfnn7"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.576974 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.581315 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.586646 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.608016 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.610443 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.612933 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.612976 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4wnb\" (UniqueName: \"kubernetes.io/projected/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kube-api-access-h4wnb\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.613044 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.613102 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.613173 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.613194 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.613957 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.712003 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.713260 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.722620 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-56plg"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.722840 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.722865 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.722905 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.722932 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5cbx\" (UniqueName: \"kubernetes.io/projected/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kube-api-access-c5cbx\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.722963 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.722987 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723004 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723026 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723047 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723064 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723108 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723130 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723148 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723166 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4wnb\" (UniqueName: \"kubernetes.io/projected/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kube-api-access-h4wnb\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723189 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723205 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723228 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-secrets\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723244 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723262 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723479 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.723607 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.724489 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.725113 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.725964 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.726748 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.739907 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.752332 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.752513 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.752811 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.758224 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.760187 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4wnb\" (UniqueName: \"kubernetes.io/projected/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kube-api-access-h4wnb\") pod \"openstack-cell1-galera-0\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829336 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829414 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5cbx\" (UniqueName: \"kubernetes.io/projected/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kube-api-access-c5cbx\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829450 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kolla-config\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829508 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829524 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829589 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829785 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829865 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-config-data\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829907 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829949 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.829986 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.830031 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtxm2\" (UniqueName: \"kubernetes.io/projected/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kube-api-access-mtxm2\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.830062 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-secrets\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.830792 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.831539 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.831573 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.831917 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.834553 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.836243 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.836765 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.836955 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-secrets\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.840172 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.853517 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5cbx\" (UniqueName: \"kubernetes.io/projected/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kube-api-access-c5cbx\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.883460 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.897010 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.936757 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtxm2\" (UniqueName: \"kubernetes.io/projected/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kube-api-access-mtxm2\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.936838 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.936871 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kolla-config\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.936909 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-config-data\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.936933 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.938222 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-config-data\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.939043 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kolla-config\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.941222 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.942994 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:25 crc kubenswrapper[4991]: I1006 08:37:25.961818 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtxm2\" (UniqueName: \"kubernetes.io/projected/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kube-api-access-mtxm2\") pod \"memcached-0\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " pod="openstack/memcached-0"
Oct 06 08:37:26 crc kubenswrapper[4991]: I1006 08:37:26.060753 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 06 08:37:26 crc kubenswrapper[4991]: I1006 08:37:26.103258 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 06 08:37:27 crc kubenswrapper[4991]: I1006 08:37:27.529315 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:37:27 crc kubenswrapper[4991]: I1006 08:37:27.529374 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:37:27 crc kubenswrapper[4991]: I1006 08:37:27.914715 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 08:37:27 crc kubenswrapper[4991]: I1006 08:37:27.915912 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:37:27 crc kubenswrapper[4991]: I1006 08:37:27.918178 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mnnj6" Oct 06 08:37:27 crc kubenswrapper[4991]: I1006 08:37:27.924789 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:37:28 crc kubenswrapper[4991]: I1006 08:37:28.068687 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5gkk\" (UniqueName: \"kubernetes.io/projected/849ecb33-22bf-43e5-ac4c-9a3e5cc0c668-kube-api-access-p5gkk\") pod \"kube-state-metrics-0\" (UID: \"849ecb33-22bf-43e5-ac4c-9a3e5cc0c668\") " pod="openstack/kube-state-metrics-0" Oct 06 08:37:28 crc kubenswrapper[4991]: I1006 08:37:28.170374 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5gkk\" (UniqueName: \"kubernetes.io/projected/849ecb33-22bf-43e5-ac4c-9a3e5cc0c668-kube-api-access-p5gkk\") pod \"kube-state-metrics-0\" (UID: \"849ecb33-22bf-43e5-ac4c-9a3e5cc0c668\") " pod="openstack/kube-state-metrics-0" Oct 06 08:37:28 crc kubenswrapper[4991]: I1006 08:37:28.192049 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5gkk\" (UniqueName: \"kubernetes.io/projected/849ecb33-22bf-43e5-ac4c-9a3e5cc0c668-kube-api-access-p5gkk\") pod \"kube-state-metrics-0\" (UID: \"849ecb33-22bf-43e5-ac4c-9a3e5cc0c668\") " pod="openstack/kube-state-metrics-0" Oct 06 08:37:28 crc kubenswrapper[4991]: I1006 08:37:28.278226 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.725969 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jklxx"] Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.727553 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.729112 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.730343 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-p96wf" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.732602 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.747806 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jklxx"] Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.754999 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5prwt"] Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.757150 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.788735 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5prwt"] Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835377 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-run\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835421 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-combined-ca-bundle\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835452 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-log-ovn\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835472 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835500 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpf54\" (UniqueName: 
\"kubernetes.io/projected/63c7d8f9-5c85-4999-b60b-517b03ff5992-kube-api-access-rpf54\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835527 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-log\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835604 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-lib\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835628 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63c7d8f9-5c85-4999-b60b-517b03ff5992-scripts\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835647 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp8dl\" (UniqueName: \"kubernetes.io/projected/188f566f-7d4a-4b9f-b74d-bbee761c0bea-kube-api-access-sp8dl\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835673 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-ovn-controller-tls-certs\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835721 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188f566f-7d4a-4b9f-b74d-bbee761c0bea-scripts\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835745 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run-ovn\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.835840 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-etc-ovs\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.936923 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-lib\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.936966 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63c7d8f9-5c85-4999-b60b-517b03ff5992-scripts\") pod \"ovn-controller-ovs-5prwt\" 
(UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.936983 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp8dl\" (UniqueName: \"kubernetes.io/projected/188f566f-7d4a-4b9f-b74d-bbee761c0bea-kube-api-access-sp8dl\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.937003 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-ovn-controller-tls-certs\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.937021 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188f566f-7d4a-4b9f-b74d-bbee761c0bea-scripts\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.937040 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run-ovn\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.937062 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-etc-ovs\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc 
kubenswrapper[4991]: I1006 08:37:31.937108 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-run\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.937126 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-combined-ca-bundle\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.937141 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-log-ovn\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.937161 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.937192 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpf54\" (UniqueName: \"kubernetes.io/projected/63c7d8f9-5c85-4999-b60b-517b03ff5992-kube-api-access-rpf54\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.937212 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-log\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.937700 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-log\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.938422 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-lib\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.939648 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63c7d8f9-5c85-4999-b60b-517b03ff5992-scripts\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.939852 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.939941 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188f566f-7d4a-4b9f-b74d-bbee761c0bea-scripts\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " 
pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.939957 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-log-ovn\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.940136 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-etc-ovs\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.940235 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run-ovn\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.940321 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-run\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.942942 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-combined-ca-bundle\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.949561 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-ovn-controller-tls-certs\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.957529 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp8dl\" (UniqueName: \"kubernetes.io/projected/188f566f-7d4a-4b9f-b74d-bbee761c0bea-kube-api-access-sp8dl\") pod \"ovn-controller-jklxx\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " pod="openstack/ovn-controller-jklxx" Oct 06 08:37:31 crc kubenswrapper[4991]: I1006 08:37:31.966007 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpf54\" (UniqueName: \"kubernetes.io/projected/63c7d8f9-5c85-4999-b60b-517b03ff5992-kube-api-access-rpf54\") pod \"ovn-controller-ovs-5prwt\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.060078 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jklxx" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.076748 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.219662 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.221020 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.223195 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.223228 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.223269 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.224266 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.228990 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.230435 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cqz27" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.343087 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qml8q\" (UniqueName: \"kubernetes.io/projected/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-kube-api-access-qml8q\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.343177 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.343217 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.343242 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.343267 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.343327 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.343352 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.343388 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-config\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.444903 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.444960 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.444998 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-config\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.445081 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qml8q\" (UniqueName: \"kubernetes.io/projected/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-kube-api-access-qml8q\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.445115 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 
08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.445139 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.445172 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.445194 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.445360 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.445677 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.446819 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.446968 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-config\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.459089 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.459248 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.459695 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.462066 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qml8q\" (UniqueName: \"kubernetes.io/projected/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-kube-api-access-qml8q\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " 
pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.466665 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:32 crc kubenswrapper[4991]: I1006 08:37:32.553134 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.236587 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.238092 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.239760 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8q2ff" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.240448 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.240576 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.244708 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.267615 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.437644 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-config\") pod \"ovsdbserver-sb-0\" 
(UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.437731 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.437825 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.437873 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.437942 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.438087 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1b135498-feb3-4024-b655-92f403f55bb9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 
crc kubenswrapper[4991]: I1006 08:37:35.438203 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.438261 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95x62\" (UniqueName: \"kubernetes.io/projected/1b135498-feb3-4024-b655-92f403f55bb9-kube-api-access-95x62\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.539575 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.539645 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95x62\" (UniqueName: \"kubernetes.io/projected/1b135498-feb3-4024-b655-92f403f55bb9-kube-api-access-95x62\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.539735 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-config\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.539770 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.539821 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.539844 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.539873 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.539929 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1b135498-feb3-4024-b655-92f403f55bb9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.540495 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1b135498-feb3-4024-b655-92f403f55bb9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.540647 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.542403 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.542628 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-config\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.546104 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.546892 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.548031 4991 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.562430 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95x62\" (UniqueName: \"kubernetes.io/projected/1b135498-feb3-4024-b655-92f403f55bb9-kube-api-access-95x62\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.566877 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:35 crc kubenswrapper[4991]: I1006 08:37:35.577274 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.106329 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.106728 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7987,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:37:38 crc 
kubenswrapper[4991]: E1006 08:37:38.107938 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.141214 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.141497 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kjlld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(53c6aca4-4fd0-4d42-bbe2-4b6e91643503): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:37:38 crc 
kubenswrapper[4991]: E1006 08:37:38.142661 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.371515 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.371603 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.810657 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.811040 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bh92d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-lv52b_openstack(1f4ba1fc-cafd-47e7-812a-6041044f864b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.814571 4991 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" podUID="1f4ba1fc-cafd-47e7-812a-6041044f864b" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.818788 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.818945 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6d56,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-bqkbr_openstack(757b126b-cbc0-4390-a7e4-8223dee3aadb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.820013 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.820061 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" podUID="757b126b-cbc0-4390-a7e4-8223dee3aadb" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.820193 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5hqsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil
,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-mx7ks_openstack(ccc129da-febf-4041-b4cf-eabd00b0e163): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.821989 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" podUID="ccc129da-febf-4041-b4cf-eabd00b0e163" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.831787 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.831976 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4ctb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-v7gwm_openstack(62f2b65d-d3ac-49ae-b398-f379a6bda788): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:37:38 crc kubenswrapper[4991]: E1006 08:37:38.833339 4991 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" podUID="62f2b65d-d3ac-49ae-b398-f379a6bda788" Oct 06 08:37:39 crc kubenswrapper[4991]: I1006 08:37:39.335780 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 08:37:39 crc kubenswrapper[4991]: W1006 08:37:39.360943 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1f986ad_8a8d_44d3_b200_479a60f8b8b3.slice/crio-f7fb7232d6c2b21d3774a6e1ede7c12929d141e3b2231f4a00567a59297a81bf WatchSource:0}: Error finding container f7fb7232d6c2b21d3774a6e1ede7c12929d141e3b2231f4a00567a59297a81bf: Status 404 returned error can't find the container with id f7fb7232d6c2b21d3774a6e1ede7c12929d141e3b2231f4a00567a59297a81bf Oct 06 08:37:39 crc kubenswrapper[4991]: I1006 08:37:39.382923 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d1f986ad-8a8d-44d3-b200-479a60f8b8b3","Type":"ContainerStarted","Data":"f7fb7232d6c2b21d3774a6e1ede7c12929d141e3b2231f4a00567a59297a81bf"} Oct 06 08:37:39 crc kubenswrapper[4991]: E1006 08:37:39.388682 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" podUID="1f4ba1fc-cafd-47e7-812a-6041044f864b" Oct 06 08:37:39 crc kubenswrapper[4991]: E1006 08:37:39.389183 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" 
podUID="62f2b65d-d3ac-49ae-b398-f379a6bda788" Oct 06 08:37:39 crc kubenswrapper[4991]: I1006 08:37:39.445487 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 08:37:39 crc kubenswrapper[4991]: I1006 08:37:39.459090 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:37:39 crc kubenswrapper[4991]: I1006 08:37:39.508418 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 08:37:39 crc kubenswrapper[4991]: I1006 08:37:39.518899 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jklxx"] Oct 06 08:37:39 crc kubenswrapper[4991]: I1006 08:37:39.559886 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:37:39 crc kubenswrapper[4991]: I1006 08:37:39.844004 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5prwt"] Oct 06 08:37:39 crc kubenswrapper[4991]: I1006 08:37:39.920472 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" Oct 06 08:37:39 crc kubenswrapper[4991]: I1006 08:37:39.928201 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.040336 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757b126b-cbc0-4390-a7e4-8223dee3aadb-config\") pod \"757b126b-cbc0-4390-a7e4-8223dee3aadb\" (UID: \"757b126b-cbc0-4390-a7e4-8223dee3aadb\") " Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.040526 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-config\") pod \"ccc129da-febf-4041-b4cf-eabd00b0e163\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.040604 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hqsm\" (UniqueName: \"kubernetes.io/projected/ccc129da-febf-4041-b4cf-eabd00b0e163-kube-api-access-5hqsm\") pod \"ccc129da-febf-4041-b4cf-eabd00b0e163\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.040685 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-dns-svc\") pod \"ccc129da-febf-4041-b4cf-eabd00b0e163\" (UID: \"ccc129da-febf-4041-b4cf-eabd00b0e163\") " Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.040817 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6d56\" (UniqueName: \"kubernetes.io/projected/757b126b-cbc0-4390-a7e4-8223dee3aadb-kube-api-access-g6d56\") pod \"757b126b-cbc0-4390-a7e4-8223dee3aadb\" (UID: \"757b126b-cbc0-4390-a7e4-8223dee3aadb\") " Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.040906 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/757b126b-cbc0-4390-a7e4-8223dee3aadb-config" (OuterVolumeSpecName: "config") pod "757b126b-cbc0-4390-a7e4-8223dee3aadb" (UID: "757b126b-cbc0-4390-a7e4-8223dee3aadb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.040928 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-config" (OuterVolumeSpecName: "config") pod "ccc129da-febf-4041-b4cf-eabd00b0e163" (UID: "ccc129da-febf-4041-b4cf-eabd00b0e163"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.041147 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ccc129da-febf-4041-b4cf-eabd00b0e163" (UID: "ccc129da-febf-4041-b4cf-eabd00b0e163"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.041433 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757b126b-cbc0-4390-a7e4-8223dee3aadb-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.041453 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.041467 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccc129da-febf-4041-b4cf-eabd00b0e163-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.046080 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757b126b-cbc0-4390-a7e4-8223dee3aadb-kube-api-access-g6d56" (OuterVolumeSpecName: "kube-api-access-g6d56") pod "757b126b-cbc0-4390-a7e4-8223dee3aadb" (UID: "757b126b-cbc0-4390-a7e4-8223dee3aadb"). InnerVolumeSpecName "kube-api-access-g6d56". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.046581 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc129da-febf-4041-b4cf-eabd00b0e163-kube-api-access-5hqsm" (OuterVolumeSpecName: "kube-api-access-5hqsm") pod "ccc129da-febf-4041-b4cf-eabd00b0e163" (UID: "ccc129da-febf-4041-b4cf-eabd00b0e163"). InnerVolumeSpecName "kube-api-access-5hqsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.142528 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6d56\" (UniqueName: \"kubernetes.io/projected/757b126b-cbc0-4390-a7e4-8223dee3aadb-kube-api-access-g6d56\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.142741 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hqsm\" (UniqueName: \"kubernetes.io/projected/ccc129da-febf-4041-b4cf-eabd00b0e163-kube-api-access-5hqsm\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.392683 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5prwt" event={"ID":"63c7d8f9-5c85-4999-b60b-517b03ff5992","Type":"ContainerStarted","Data":"3faa43e3ebe5b0a934ddcdf3553d1b39c5fd37531efe75285afb4fc1de61c554"} Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.398002 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1b135498-feb3-4024-b655-92f403f55bb9","Type":"ContainerStarted","Data":"222efa71f1bf216a77d131176cd2facbb90b30333355df78c9608e6b61ee430c"} Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.400252 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" event={"ID":"ccc129da-febf-4041-b4cf-eabd00b0e163","Type":"ContainerDied","Data":"715cf7089c1ccc92b2a46deb7d72455427b2cd2fa1b47f45508dc1961ace5f1a"} Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.400265 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mx7ks" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.402139 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"157f3f65-3397-4a2d-98ea-1ae5897c7a76","Type":"ContainerStarted","Data":"19bfa8422af3ebc6a59245353778ba8be20c0c90fd0573ce41a887263f3cbe62"} Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.404217 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"849ecb33-22bf-43e5-ac4c-9a3e5cc0c668","Type":"ContainerStarted","Data":"c6ba945d618e0b9e5bedd388788e1f9ef676b3fab13c3fa86a94ca5c6129ab4e"} Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.405116 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jklxx" event={"ID":"188f566f-7d4a-4b9f-b74d-bbee761c0bea","Type":"ContainerStarted","Data":"18957892ea4516db63f3ea0c7c12689f3d3ac981b8c90d15a7a673283632396e"} Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.406119 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.406211 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" event={"ID":"757b126b-cbc0-4390-a7e4-8223dee3aadb","Type":"ContainerDied","Data":"844ae0f8045335a65d3ef4b423163b53e9548c925fd971d45ddf7c5533d3bd18"} Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.406236 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bqkbr" Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.409012 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"033164fc-5a6f-4b9d-8c3a-1e4242078c9e","Type":"ContainerStarted","Data":"44882ae90323aa32d1ae693d1acd497085921d30fdb2bc915ad61024da4420d8"} Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.459338 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mx7ks"] Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.481554 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mx7ks"] Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.494512 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bqkbr"] Oct 06 08:37:40 crc kubenswrapper[4991]: I1006 08:37:40.501004 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bqkbr"] Oct 06 08:37:41 crc kubenswrapper[4991]: I1006 08:37:41.256216 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757b126b-cbc0-4390-a7e4-8223dee3aadb" path="/var/lib/kubelet/pods/757b126b-cbc0-4390-a7e4-8223dee3aadb/volumes" Oct 06 08:37:41 crc kubenswrapper[4991]: I1006 08:37:41.256683 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc129da-febf-4041-b4cf-eabd00b0e163" path="/var/lib/kubelet/pods/ccc129da-febf-4041-b4cf-eabd00b0e163/volumes" Oct 06 08:37:41 crc kubenswrapper[4991]: I1006 08:37:41.416521 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6ad6d483-bca3-4391-9e4c-290b6b15b1f4","Type":"ContainerStarted","Data":"4ca1911997e9bbd3466058aaa12c187f5778061ad0ecd52fb889992119a4044f"} Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.470333 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"033164fc-5a6f-4b9d-8c3a-1e4242078c9e","Type":"ContainerStarted","Data":"beb72f2fe0b1d7a0ddb9249baead2d79de7b72973b4fde75ed6d9bc96c982e97"} Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.471420 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.474799 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5prwt" event={"ID":"63c7d8f9-5c85-4999-b60b-517b03ff5992","Type":"ContainerStarted","Data":"7361661faa0dd965eb9150f74e1354d3da89ded19ead38c6742a02c9d3302dbc"} Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.476943 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1b135498-feb3-4024-b655-92f403f55bb9","Type":"ContainerStarted","Data":"36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b"} Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.479496 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6ad6d483-bca3-4391-9e4c-290b6b15b1f4","Type":"ContainerStarted","Data":"d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e"} Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.481548 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"157f3f65-3397-4a2d-98ea-1ae5897c7a76","Type":"ContainerStarted","Data":"367eeb397b00a7696d851f72cefdac0146f8753511bf7f8e96400955a3dea1fd"} Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.483360 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d1f986ad-8a8d-44d3-b200-479a60f8b8b3","Type":"ContainerStarted","Data":"96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc"} Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.492170 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"849ecb33-22bf-43e5-ac4c-9a3e5cc0c668","Type":"ContainerStarted","Data":"280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca"} Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.492308 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.495815 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jklxx" event={"ID":"188f566f-7d4a-4b9f-b74d-bbee761c0bea","Type":"ContainerStarted","Data":"9ce5d29d20258b039798707827257132b489fc0eb7a72cd595e6df3602e635af"} Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.495974 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-jklxx" Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.499117 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.834956563 podStartE2EDuration="23.499100656s" podCreationTimestamp="2025-10-06 08:37:25 +0000 UTC" firstStartedPulling="2025-10-06 08:37:39.490756511 +0000 UTC m=+1111.228506532" lastFinishedPulling="2025-10-06 08:37:47.154900604 +0000 UTC m=+1118.892650625" observedRunningTime="2025-10-06 08:37:48.497419748 +0000 UTC m=+1120.235169779" watchObservedRunningTime="2025-10-06 08:37:48.499100656 +0000 UTC m=+1120.236850677" Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.570399 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.116027872 podStartE2EDuration="21.570382096s" podCreationTimestamp="2025-10-06 08:37:27 +0000 UTC" firstStartedPulling="2025-10-06 08:37:39.48576563 +0000 UTC m=+1111.223515651" lastFinishedPulling="2025-10-06 08:37:47.940119854 +0000 UTC m=+1119.677869875" observedRunningTime="2025-10-06 08:37:48.56769035 +0000 UTC m=+1120.305440371" watchObservedRunningTime="2025-10-06 08:37:48.570382096 
+0000 UTC m=+1120.308132117" Oct 06 08:37:48 crc kubenswrapper[4991]: I1006 08:37:48.612153 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-jklxx" podStartSLOduration=9.271886966 podStartE2EDuration="17.612132613s" podCreationTimestamp="2025-10-06 08:37:31 +0000 UTC" firstStartedPulling="2025-10-06 08:37:39.526357725 +0000 UTC m=+1111.264107746" lastFinishedPulling="2025-10-06 08:37:47.866603372 +0000 UTC m=+1119.604353393" observedRunningTime="2025-10-06 08:37:48.61166305 +0000 UTC m=+1120.349413071" watchObservedRunningTime="2025-10-06 08:37:48.612132613 +0000 UTC m=+1120.349882634" Oct 06 08:37:49 crc kubenswrapper[4991]: I1006 08:37:49.505382 4991 generic.go:334] "Generic (PLEG): container finished" podID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerID="7361661faa0dd965eb9150f74e1354d3da89ded19ead38c6742a02c9d3302dbc" exitCode=0 Oct 06 08:37:49 crc kubenswrapper[4991]: I1006 08:37:49.505527 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5prwt" event={"ID":"63c7d8f9-5c85-4999-b60b-517b03ff5992","Type":"ContainerDied","Data":"7361661faa0dd965eb9150f74e1354d3da89ded19ead38c6742a02c9d3302dbc"} Oct 06 08:37:50 crc kubenswrapper[4991]: I1006 08:37:50.513564 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5prwt" event={"ID":"63c7d8f9-5c85-4999-b60b-517b03ff5992","Type":"ContainerStarted","Data":"2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7"} Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.521821 4991 generic.go:334] "Generic (PLEG): container finished" podID="d1f986ad-8a8d-44d3-b200-479a60f8b8b3" containerID="96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc" exitCode=0 Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.521920 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"d1f986ad-8a8d-44d3-b200-479a60f8b8b3","Type":"ContainerDied","Data":"96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc"} Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.527868 4991 generic.go:334] "Generic (PLEG): container finished" podID="1f4ba1fc-cafd-47e7-812a-6041044f864b" containerID="b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020" exitCode=0 Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.527944 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" event={"ID":"1f4ba1fc-cafd-47e7-812a-6041044f864b","Type":"ContainerDied","Data":"b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020"} Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.534227 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5prwt" event={"ID":"63c7d8f9-5c85-4999-b60b-517b03ff5992","Type":"ContainerStarted","Data":"6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159"} Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.534363 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.538117 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1b135498-feb3-4024-b655-92f403f55bb9","Type":"ContainerStarted","Data":"b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9"} Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.548693 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6ad6d483-bca3-4391-9e4c-290b6b15b1f4","Type":"ContainerStarted","Data":"10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1"} Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.551452 4991 generic.go:334] "Generic (PLEG): container finished" podID="157f3f65-3397-4a2d-98ea-1ae5897c7a76" 
containerID="367eeb397b00a7696d851f72cefdac0146f8753511bf7f8e96400955a3dea1fd" exitCode=0 Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.551490 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"157f3f65-3397-4a2d-98ea-1ae5897c7a76","Type":"ContainerDied","Data":"367eeb397b00a7696d851f72cefdac0146f8753511bf7f8e96400955a3dea1fd"} Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.566365 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5prwt" podStartSLOduration=13.110242633 podStartE2EDuration="20.566349781s" podCreationTimestamp="2025-10-06 08:37:31 +0000 UTC" firstStartedPulling="2025-10-06 08:37:39.850511184 +0000 UTC m=+1111.588261205" lastFinishedPulling="2025-10-06 08:37:47.306618322 +0000 UTC m=+1119.044368353" observedRunningTime="2025-10-06 08:37:51.559885629 +0000 UTC m=+1123.297635650" watchObservedRunningTime="2025-10-06 08:37:51.566349781 +0000 UTC m=+1123.304099802" Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.605055 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.133630357 podStartE2EDuration="17.605032782s" podCreationTimestamp="2025-10-06 08:37:34 +0000 UTC" firstStartedPulling="2025-10-06 08:37:39.578541116 +0000 UTC m=+1111.316291147" lastFinishedPulling="2025-10-06 08:37:51.049943551 +0000 UTC m=+1122.787693572" observedRunningTime="2025-10-06 08:37:51.601473642 +0000 UTC m=+1123.339223683" watchObservedRunningTime="2025-10-06 08:37:51.605032782 +0000 UTC m=+1123.342782793" Oct 06 08:37:51 crc kubenswrapper[4991]: I1006 08:37:51.643828 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.032058311 podStartE2EDuration="20.643805536s" podCreationTimestamp="2025-10-06 08:37:31 +0000 UTC" firstStartedPulling="2025-10-06 08:37:40.420145947 +0000 UTC 
m=+1112.157895968" lastFinishedPulling="2025-10-06 08:37:51.031893172 +0000 UTC m=+1122.769643193" observedRunningTime="2025-10-06 08:37:51.639531246 +0000 UTC m=+1123.377281267" watchObservedRunningTime="2025-10-06 08:37:51.643805536 +0000 UTC m=+1123.381555577" Oct 06 08:37:52 crc kubenswrapper[4991]: I1006 08:37:52.077561 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:37:52 crc kubenswrapper[4991]: I1006 08:37:52.553639 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:52 crc kubenswrapper[4991]: I1006 08:37:52.565455 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d1f986ad-8a8d-44d3-b200-479a60f8b8b3","Type":"ContainerStarted","Data":"9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc"} Oct 06 08:37:52 crc kubenswrapper[4991]: I1006 08:37:52.569362 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" event={"ID":"1f4ba1fc-cafd-47e7-812a-6041044f864b","Type":"ContainerStarted","Data":"7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc"} Oct 06 08:37:52 crc kubenswrapper[4991]: I1006 08:37:52.569667 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:52 crc kubenswrapper[4991]: I1006 08:37:52.571941 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1","Type":"ContainerStarted","Data":"545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381"} Oct 06 08:37:52 crc kubenswrapper[4991]: I1006 08:37:52.575537 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"157f3f65-3397-4a2d-98ea-1ae5897c7a76","Type":"ContainerStarted","Data":"94c589983290634c76235daa1990cab452138af9c99951302ddc413d46fc20a4"} Oct 06 08:37:52 crc kubenswrapper[4991]: I1006 08:37:52.611899 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.672273782 podStartE2EDuration="28.611860822s" podCreationTimestamp="2025-10-06 08:37:24 +0000 UTC" firstStartedPulling="2025-10-06 08:37:39.364184112 +0000 UTC m=+1111.101934133" lastFinishedPulling="2025-10-06 08:37:47.303771162 +0000 UTC m=+1119.041521173" observedRunningTime="2025-10-06 08:37:52.598967378 +0000 UTC m=+1124.336717439" watchObservedRunningTime="2025-10-06 08:37:52.611860822 +0000 UTC m=+1124.349610893" Oct 06 08:37:52 crc kubenswrapper[4991]: I1006 08:37:52.633872 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.735819563 podStartE2EDuration="28.633852352s" podCreationTimestamp="2025-10-06 08:37:24 +0000 UTC" firstStartedPulling="2025-10-06 08:37:39.486449099 +0000 UTC m=+1111.224199120" lastFinishedPulling="2025-10-06 08:37:47.384481888 +0000 UTC m=+1119.122231909" observedRunningTime="2025-10-06 08:37:52.629746076 +0000 UTC m=+1124.367496137" watchObservedRunningTime="2025-10-06 08:37:52.633852352 +0000 UTC m=+1124.371602363" Oct 06 08:37:52 crc kubenswrapper[4991]: I1006 08:37:52.652736 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" podStartSLOduration=2.789921161 podStartE2EDuration="31.652705063s" podCreationTimestamp="2025-10-06 08:37:21 +0000 UTC" firstStartedPulling="2025-10-06 08:37:22.172042023 +0000 UTC m=+1093.909792044" lastFinishedPulling="2025-10-06 08:37:51.034825925 +0000 UTC m=+1122.772575946" observedRunningTime="2025-10-06 08:37:52.644972375 +0000 UTC m=+1124.382722416" watchObservedRunningTime="2025-10-06 08:37:52.652705063 +0000 
UTC m=+1124.390455084" Oct 06 08:37:53 crc kubenswrapper[4991]: I1006 08:37:53.553840 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:53 crc kubenswrapper[4991]: I1006 08:37:53.578492 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:53 crc kubenswrapper[4991]: I1006 08:37:53.586542 4991 generic.go:334] "Generic (PLEG): container finished" podID="62f2b65d-d3ac-49ae-b398-f379a6bda788" containerID="961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366" exitCode=0 Oct 06 08:37:53 crc kubenswrapper[4991]: I1006 08:37:53.586734 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" event={"ID":"62f2b65d-d3ac-49ae-b398-f379a6bda788","Type":"ContainerDied","Data":"961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366"} Oct 06 08:37:53 crc kubenswrapper[4991]: I1006 08:37:53.614951 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:53 crc kubenswrapper[4991]: I1006 08:37:53.628875 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.603401 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" event={"ID":"62f2b65d-d3ac-49ae-b398-f379a6bda788","Type":"ContainerStarted","Data":"22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0"} Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.604103 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.607025 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"53c6aca4-4fd0-4d42-bbe2-4b6e91643503","Type":"ContainerStarted","Data":"30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f"} Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.607713 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.629157 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" podStartSLOduration=-9223372003.225645 podStartE2EDuration="33.629131692s" podCreationTimestamp="2025-10-06 08:37:21 +0000 UTC" firstStartedPulling="2025-10-06 08:37:22.193206014 +0000 UTC m=+1093.930956035" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:37:54.621769374 +0000 UTC m=+1126.359519485" watchObservedRunningTime="2025-10-06 08:37:54.629131692 +0000 UTC m=+1126.366881723" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.657952 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.669934 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.826932 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7gwm"] Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.866916 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-q24qh"] Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.868410 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.872746 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.882209 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-q24qh"] Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.904459 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-df86g"] Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.906894 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.913839 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.941860 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-df86g"] Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.981251 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.982962 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.989228 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.989289 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.989405 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 06 08:37:54 crc kubenswrapper[4991]: I1006 08:37:54.989591 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-h9shm" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016376 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovn-rundir\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016436 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-config\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016460 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016476 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q77j\" (UniqueName: \"kubernetes.io/projected/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-kube-api-access-6q77j\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016521 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-config\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016564 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016585 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-289wb\" (UniqueName: \"kubernetes.io/projected/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-kube-api-access-289wb\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016628 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-combined-ca-bundle\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " 
pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016653 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovs-rundir\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016669 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.016792 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.046265 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv52b"] Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.047917 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" podUID="1f4ba1fc-cafd-47e7-812a-6041044f864b" containerName="dnsmasq-dns" containerID="cri-o://7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc" gracePeriod=10 Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.065042 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-mwgc9"] Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.066424 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.069241 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.109734 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mwgc9"] Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118077 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118125 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118156 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-combined-ca-bundle\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118174 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovs-rundir\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 
08:37:55.118194 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118242 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7dbb\" (UniqueName: \"kubernetes.io/projected/51a7066c-5143-43ab-b642-81f461a9c1f4-kube-api-access-r7dbb\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118259 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-config\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118274 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovn-rundir\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118317 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-config\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118337 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118355 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q77j\" (UniqueName: \"kubernetes.io/projected/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-kube-api-access-6q77j\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118379 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-scripts\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118395 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118424 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-config\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118443 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118492 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-289wb\" (UniqueName: \"kubernetes.io/projected/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-kube-api-access-289wb\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118516 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118550 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovn-rundir\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.118597 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovs-rundir\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.119011 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-dns-svc\") pod 
\"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.120049 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-config\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.120181 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-config\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.120632 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.123842 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-combined-ca-bundle\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.139524 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " 
pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.141610 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-289wb\" (UniqueName: \"kubernetes.io/projected/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-kube-api-access-289wb\") pod \"dnsmasq-dns-6bc7876d45-q24qh\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.142104 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q77j\" (UniqueName: \"kubernetes.io/projected/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-kube-api-access-6q77j\") pod \"ovn-controller-metrics-df86g\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.195036 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.219760 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.219822 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhx99\" (UniqueName: \"kubernetes.io/projected/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-kube-api-access-nhx99\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.219842 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-config\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.219863 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.219885 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.219906 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.219941 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7dbb\" (UniqueName: \"kubernetes.io/projected/51a7066c-5143-43ab-b642-81f461a9c1f4-kube-api-access-r7dbb\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.219962 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-config\") pod \"ovn-northd-0\" (UID: 
\"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.219984 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.220026 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-dns-svc\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.220049 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-scripts\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.220065 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.221841 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-config\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.222723 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.222844 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-scripts\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.223554 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.224723 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.225469 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.239500 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.247432 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7dbb\" (UniqueName: \"kubernetes.io/projected/51a7066c-5143-43ab-b642-81f461a9c1f4-kube-api-access-r7dbb\") pod \"ovn-northd-0\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.303903 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.321586 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhx99\" (UniqueName: \"kubernetes.io/projected/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-kube-api-access-nhx99\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.321626 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-config\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.321662 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.321726 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.321961 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-dns-svc\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.322651 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-config\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.322669 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.323655 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-dns-svc\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.324417 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: 
\"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.338895 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhx99\" (UniqueName: \"kubernetes.io/projected/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-kube-api-access-nhx99\") pod \"dnsmasq-dns-8554648995-mwgc9\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.505092 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.516274 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.627474 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh92d\" (UniqueName: \"kubernetes.io/projected/1f4ba1fc-cafd-47e7-812a-6041044f864b-kube-api-access-bh92d\") pod \"1f4ba1fc-cafd-47e7-812a-6041044f864b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.627809 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-config\") pod \"1f4ba1fc-cafd-47e7-812a-6041044f864b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.627929 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-dns-svc\") pod \"1f4ba1fc-cafd-47e7-812a-6041044f864b\" (UID: \"1f4ba1fc-cafd-47e7-812a-6041044f864b\") " Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.637235 4991 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f4ba1fc-cafd-47e7-812a-6041044f864b-kube-api-access-bh92d" (OuterVolumeSpecName: "kube-api-access-bh92d") pod "1f4ba1fc-cafd-47e7-812a-6041044f864b" (UID: "1f4ba1fc-cafd-47e7-812a-6041044f864b"). InnerVolumeSpecName "kube-api-access-bh92d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.638107 4991 generic.go:334] "Generic (PLEG): container finished" podID="1f4ba1fc-cafd-47e7-812a-6041044f864b" containerID="7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc" exitCode=0 Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.638159 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" event={"ID":"1f4ba1fc-cafd-47e7-812a-6041044f864b","Type":"ContainerDied","Data":"7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc"} Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.638203 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" event={"ID":"1f4ba1fc-cafd-47e7-812a-6041044f864b","Type":"ContainerDied","Data":"ee49a38d1cbdec6d046749facabe7ed9dafde8ec72c597e06f1eebfdae116dad"} Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.638226 4991 scope.go:117] "RemoveContainer" containerID="7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.638435 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-lv52b" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.674810 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f4ba1fc-cafd-47e7-812a-6041044f864b" (UID: "1f4ba1fc-cafd-47e7-812a-6041044f864b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.678219 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-config" (OuterVolumeSpecName: "config") pod "1f4ba1fc-cafd-47e7-812a-6041044f864b" (UID: "1f4ba1fc-cafd-47e7-812a-6041044f864b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.681636 4991 scope.go:117] "RemoveContainer" containerID="b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.700639 4991 scope.go:117] "RemoveContainer" containerID="7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc" Oct 06 08:37:55 crc kubenswrapper[4991]: E1006 08:37:55.701768 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc\": container with ID starting with 7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc not found: ID does not exist" containerID="7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.701827 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc"} err="failed to get container status \"7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc\": rpc error: code = NotFound desc = could not find container \"7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc\": container with ID starting with 7fc42f87279117e067d9299c171736b864c3853761998672ec172ba1cc8102fc not found: ID does not exist" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.701861 4991 scope.go:117] "RemoveContainer" 
containerID="b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020" Oct 06 08:37:55 crc kubenswrapper[4991]: E1006 08:37:55.702323 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020\": container with ID starting with b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020 not found: ID does not exist" containerID="b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.702358 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020"} err="failed to get container status \"b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020\": rpc error: code = NotFound desc = could not find container \"b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020\": container with ID starting with b21a223f2132057244752ca4451d6757a04c50aa0e4d3df32176e63157c95020 not found: ID does not exist" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.731413 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.731460 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh92d\" (UniqueName: \"kubernetes.io/projected/1f4ba1fc-cafd-47e7-812a-6041044f864b-kube-api-access-bh92d\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.731474 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f4ba1fc-cafd-47e7-812a-6041044f864b-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.744596 4991 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-q24qh"] Oct 06 08:37:55 crc kubenswrapper[4991]: W1006 08:37:55.752481 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f1201ff_e6ae_4e4c_896a_5af8ec2ec518.slice/crio-1a845cb7250d6262564e31f8a2d930709e4976acbc6b9ff1958398b061694f89 WatchSource:0}: Error finding container 1a845cb7250d6262564e31f8a2d930709e4976acbc6b9ff1958398b061694f89: Status 404 returned error can't find the container with id 1a845cb7250d6262564e31f8a2d930709e4976acbc6b9ff1958398b061694f89 Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.860773 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-df86g"] Oct 06 08:37:55 crc kubenswrapper[4991]: W1006 08:37:55.868388 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ad30dfa_4735_4ef3_8fcc_4b6f25eefcd6.slice/crio-557b7992f30f35d6f7674d6a98d19517cd682b267963b24a83c30d22b62d0339 WatchSource:0}: Error finding container 557b7992f30f35d6f7674d6a98d19517cd682b267963b24a83c30d22b62d0339: Status 404 returned error can't find the container with id 557b7992f30f35d6f7674d6a98d19517cd682b267963b24a83c30d22b62d0339 Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.898440 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.898869 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.937824 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:37:55 crc kubenswrapper[4991]: W1006 08:37:55.954443 4991 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51a7066c_5143_43ab_b642_81f461a9c1f4.slice/crio-7d92250465135a992c3bbe54f25c5c368684c6fe1917679b89d7d898f54ea694 WatchSource:0}: Error finding container 7d92250465135a992c3bbe54f25c5c368684c6fe1917679b89d7d898f54ea694: Status 404 returned error can't find the container with id 7d92250465135a992c3bbe54f25c5c368684c6fe1917679b89d7d898f54ea694 Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.974315 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv52b"] Oct 06 08:37:55 crc kubenswrapper[4991]: I1006 08:37:55.979852 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-lv52b"] Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.020515 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mwgc9"] Oct 06 08:37:56 crc kubenswrapper[4991]: W1006 08:37:56.034944 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d325ff9_b68b_455e_bec3_5116ccd9ac8d.slice/crio-9c6862234f1a5cc1d8d62a1e2bbc2bfb3c0ba4e903481b64094fa2576956549e WatchSource:0}: Error finding container 9c6862234f1a5cc1d8d62a1e2bbc2bfb3c0ba4e903481b64094fa2576956549e: Status 404 returned error can't find the container with id 9c6862234f1a5cc1d8d62a1e2bbc2bfb3c0ba4e903481b64094fa2576956549e Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.061280 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.061542 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.107046 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 06 08:37:56 crc 
kubenswrapper[4991]: I1006 08:37:56.645923 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-df86g" event={"ID":"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6","Type":"ContainerStarted","Data":"ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb"} Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.646346 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-df86g" event={"ID":"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6","Type":"ContainerStarted","Data":"557b7992f30f35d6f7674d6a98d19517cd682b267963b24a83c30d22b62d0339"} Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.647255 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"51a7066c-5143-43ab-b642-81f461a9c1f4","Type":"ContainerStarted","Data":"7d92250465135a992c3bbe54f25c5c368684c6fe1917679b89d7d898f54ea694"} Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.648820 4991 generic.go:334] "Generic (PLEG): container finished" podID="3d325ff9-b68b-455e-bec3-5116ccd9ac8d" containerID="4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0" exitCode=0 Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.648860 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mwgc9" event={"ID":"3d325ff9-b68b-455e-bec3-5116ccd9ac8d","Type":"ContainerDied","Data":"4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0"} Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.648876 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mwgc9" event={"ID":"3d325ff9-b68b-455e-bec3-5116ccd9ac8d","Type":"ContainerStarted","Data":"9c6862234f1a5cc1d8d62a1e2bbc2bfb3c0ba4e903481b64094fa2576956549e"} Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.655331 4991 generic.go:334] "Generic (PLEG): container finished" podID="4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" 
containerID="95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602" exitCode=0 Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.656587 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" event={"ID":"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518","Type":"ContainerDied","Data":"95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602"} Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.656622 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" event={"ID":"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518","Type":"ContainerStarted","Data":"1a845cb7250d6262564e31f8a2d930709e4976acbc6b9ff1958398b061694f89"} Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.657099 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" podUID="62f2b65d-d3ac-49ae-b398-f379a6bda788" containerName="dnsmasq-dns" containerID="cri-o://22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0" gracePeriod=10 Oct 06 08:37:56 crc kubenswrapper[4991]: I1006 08:37:56.682177 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-df86g" podStartSLOduration=2.682160171 podStartE2EDuration="2.682160171s" podCreationTimestamp="2025-10-06 08:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:37:56.679804484 +0000 UTC m=+1128.417554505" watchObservedRunningTime="2025-10-06 08:37:56.682160171 +0000 UTC m=+1128.419910192" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.052852 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.188422 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4ctb\" (UniqueName: \"kubernetes.io/projected/62f2b65d-d3ac-49ae-b398-f379a6bda788-kube-api-access-h4ctb\") pod \"62f2b65d-d3ac-49ae-b398-f379a6bda788\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.188640 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-config\") pod \"62f2b65d-d3ac-49ae-b398-f379a6bda788\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.188743 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-dns-svc\") pod \"62f2b65d-d3ac-49ae-b398-f379a6bda788\" (UID: \"62f2b65d-d3ac-49ae-b398-f379a6bda788\") " Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.193733 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f2b65d-d3ac-49ae-b398-f379a6bda788-kube-api-access-h4ctb" (OuterVolumeSpecName: "kube-api-access-h4ctb") pod "62f2b65d-d3ac-49ae-b398-f379a6bda788" (UID: "62f2b65d-d3ac-49ae-b398-f379a6bda788"). InnerVolumeSpecName "kube-api-access-h4ctb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.232272 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-config" (OuterVolumeSpecName: "config") pod "62f2b65d-d3ac-49ae-b398-f379a6bda788" (UID: "62f2b65d-d3ac-49ae-b398-f379a6bda788"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.252590 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62f2b65d-d3ac-49ae-b398-f379a6bda788" (UID: "62f2b65d-d3ac-49ae-b398-f379a6bda788"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.255568 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f4ba1fc-cafd-47e7-812a-6041044f864b" path="/var/lib/kubelet/pods/1f4ba1fc-cafd-47e7-812a-6041044f864b/volumes" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.290670 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.290715 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f2b65d-d3ac-49ae-b398-f379a6bda788-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.290730 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4ctb\" (UniqueName: \"kubernetes.io/projected/62f2b65d-d3ac-49ae-b398-f379a6bda788-kube-api-access-h4ctb\") on node \"crc\" DevicePath \"\"" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.529069 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.529411 4991 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.529454 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.530055 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1369062046a805994e1e0d5f87b5a6e887447735123010879df4c4305faa2ba"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.530106 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" containerID="cri-o://e1369062046a805994e1e0d5f87b5a6e887447735123010879df4c4305faa2ba" gracePeriod=600 Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.673248 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mwgc9" event={"ID":"3d325ff9-b68b-455e-bec3-5116ccd9ac8d","Type":"ContainerStarted","Data":"48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962"} Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.673408 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.677922 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" 
event={"ID":"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518","Type":"ContainerStarted","Data":"7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5"} Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.678049 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.680788 4991 generic.go:334] "Generic (PLEG): container finished" podID="62f2b65d-d3ac-49ae-b398-f379a6bda788" containerID="22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0" exitCode=0 Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.680855 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" event={"ID":"62f2b65d-d3ac-49ae-b398-f379a6bda788","Type":"ContainerDied","Data":"22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0"} Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.680900 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" event={"ID":"62f2b65d-d3ac-49ae-b398-f379a6bda788","Type":"ContainerDied","Data":"9fd6a7fb85cf86beeb8c227ece104795edc80d4a9512f7c186a0631e5d929fa5"} Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.680900 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-v7gwm" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.680923 4991 scope.go:117] "RemoveContainer" containerID="22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.683661 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="e1369062046a805994e1e0d5f87b5a6e887447735123010879df4c4305faa2ba" exitCode=0 Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.683759 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"e1369062046a805994e1e0d5f87b5a6e887447735123010879df4c4305faa2ba"} Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.687397 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"51a7066c-5143-43ab-b642-81f461a9c1f4","Type":"ContainerStarted","Data":"fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865"} Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.711253 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-mwgc9" podStartSLOduration=2.711230187 podStartE2EDuration="2.711230187s" podCreationTimestamp="2025-10-06 08:37:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:37:57.691741627 +0000 UTC m=+1129.429491668" watchObservedRunningTime="2025-10-06 08:37:57.711230187 +0000 UTC m=+1129.448980198" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.717444 4991 scope.go:117] "RemoveContainer" containerID="961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.729403 4991 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7gwm"] Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.743053 4991 scope.go:117] "RemoveContainer" containerID="22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.743675 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-v7gwm"] Oct 06 08:37:57 crc kubenswrapper[4991]: E1006 08:37:57.744775 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0\": container with ID starting with 22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0 not found: ID does not exist" containerID="22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.744882 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0"} err="failed to get container status \"22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0\": rpc error: code = NotFound desc = could not find container \"22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0\": container with ID starting with 22d62ec812aec46a04221a5b8371e7ff427d361fa91db20d1efac516eda48ae0 not found: ID does not exist" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.744911 4991 scope.go:117] "RemoveContainer" containerID="961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366" Oct 06 08:37:57 crc kubenswrapper[4991]: E1006 08:37:57.745253 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366\": container with ID starting with 961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366 
not found: ID does not exist" containerID="961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.745281 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366"} err="failed to get container status \"961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366\": rpc error: code = NotFound desc = could not find container \"961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366\": container with ID starting with 961c029b7826b53f3e7863c63f1836bee875c60ff23108f70eb6112dd8b9f366 not found: ID does not exist" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.745315 4991 scope.go:117] "RemoveContainer" containerID="ee6239739727eb6d7bb018a70f54ea31ce396adbac7977b5d2326c033722faa0" Oct 06 08:37:57 crc kubenswrapper[4991]: I1006 08:37:57.748950 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" podStartSLOduration=3.748936229 podStartE2EDuration="3.748936229s" podCreationTimestamp="2025-10-06 08:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:37:57.72589534 +0000 UTC m=+1129.463645361" watchObservedRunningTime="2025-10-06 08:37:57.748936229 +0000 UTC m=+1129.486686250" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.281927 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-q24qh"] Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.300823 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.352144 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7mv7x"] Oct 06 08:37:58 crc kubenswrapper[4991]: E1006 
08:37:58.352497 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f2b65d-d3ac-49ae-b398-f379a6bda788" containerName="dnsmasq-dns" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.352539 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f2b65d-d3ac-49ae-b398-f379a6bda788" containerName="dnsmasq-dns" Oct 06 08:37:58 crc kubenswrapper[4991]: E1006 08:37:58.352549 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f4ba1fc-cafd-47e7-812a-6041044f864b" containerName="init" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.352559 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f4ba1fc-cafd-47e7-812a-6041044f864b" containerName="init" Oct 06 08:37:58 crc kubenswrapper[4991]: E1006 08:37:58.352577 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f2b65d-d3ac-49ae-b398-f379a6bda788" containerName="init" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.352584 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f2b65d-d3ac-49ae-b398-f379a6bda788" containerName="init" Oct 06 08:37:58 crc kubenswrapper[4991]: E1006 08:37:58.352597 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f4ba1fc-cafd-47e7-812a-6041044f864b" containerName="dnsmasq-dns" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.352604 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f4ba1fc-cafd-47e7-812a-6041044f864b" containerName="dnsmasq-dns" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.352791 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f2b65d-d3ac-49ae-b398-f379a6bda788" containerName="dnsmasq-dns" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.352813 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f4ba1fc-cafd-47e7-812a-6041044f864b" containerName="dnsmasq-dns" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.353817 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.359197 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7mv7x"] Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.528103 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.528238 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wshm\" (UniqueName: \"kubernetes.io/projected/240e42d0-244a-4222-9c12-131cb5ab3be6-kube-api-access-6wshm\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.528268 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-config\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.528366 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.528429 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.629964 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.630007 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.630071 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wshm\" (UniqueName: \"kubernetes.io/projected/240e42d0-244a-4222-9c12-131cb5ab3be6-kube-api-access-6wshm\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.630088 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-config\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.630140 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.630962 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.631018 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.631090 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.631330 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-config\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.651367 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wshm\" (UniqueName: \"kubernetes.io/projected/240e42d0-244a-4222-9c12-131cb5ab3be6-kube-api-access-6wshm\") pod \"dnsmasq-dns-b8fbc5445-7mv7x\" (UID: 
\"240e42d0-244a-4222-9c12-131cb5ab3be6\") " pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.671955 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.705079 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"588bca8d19a8065db7c6c040db1c1694b8c7daffc697ab9a2f8788b4b3c06abd"} Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.708193 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"51a7066c-5143-43ab-b642-81f461a9c1f4","Type":"ContainerStarted","Data":"9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d"} Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.708232 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 06 08:37:58 crc kubenswrapper[4991]: I1006 08:37:58.752517 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.329963946 podStartE2EDuration="4.752502067s" podCreationTimestamp="2025-10-06 08:37:54 +0000 UTC" firstStartedPulling="2025-10-06 08:37:55.95618981 +0000 UTC m=+1127.693939831" lastFinishedPulling="2025-10-06 08:37:57.378727931 +0000 UTC m=+1129.116477952" observedRunningTime="2025-10-06 08:37:58.748946136 +0000 UTC m=+1130.486696167" watchObservedRunningTime="2025-10-06 08:37:58.752502067 +0000 UTC m=+1130.490252088" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.116901 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7mv7x"] Oct 06 08:37:59 crc kubenswrapper[4991]: W1006 08:37:59.124372 4991 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod240e42d0_244a_4222_9c12_131cb5ab3be6.slice/crio-1f30410270acfb60d074f9f1d955ca744f2f24a67e9db55a258a818e5b1a2f87 WatchSource:0}: Error finding container 1f30410270acfb60d074f9f1d955ca744f2f24a67e9db55a258a818e5b1a2f87: Status 404 returned error can't find the container with id 1f30410270acfb60d074f9f1d955ca744f2f24a67e9db55a258a818e5b1a2f87 Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.259363 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f2b65d-d3ac-49ae-b398-f379a6bda788" path="/var/lib/kubelet/pods/62f2b65d-d3ac-49ae-b398-f379a6bda788/volumes" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.478190 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.487693 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.508207 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.508750 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.509233 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rd5n8" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.509438 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.521140 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.646162 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22hq5\" (UniqueName: 
\"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-kube-api-access-22hq5\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.646530 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.646721 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-lock\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.646830 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-cache\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.646943 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.715200 4991 generic.go:334] "Generic (PLEG): container finished" podID="240e42d0-244a-4222-9c12-131cb5ab3be6" containerID="fc7bacd93ce7271acf251424e765f158d74e9ba61a9ebf9aee437f2e3ae4d07c" exitCode=0 Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.715246 4991 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" event={"ID":"240e42d0-244a-4222-9c12-131cb5ab3be6","Type":"ContainerDied","Data":"fc7bacd93ce7271acf251424e765f158d74e9ba61a9ebf9aee437f2e3ae4d07c"} Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.716213 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" event={"ID":"240e42d0-244a-4222-9c12-131cb5ab3be6","Type":"ContainerStarted","Data":"1f30410270acfb60d074f9f1d955ca744f2f24a67e9db55a258a818e5b1a2f87"} Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.716478 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" podUID="4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" containerName="dnsmasq-dns" containerID="cri-o://7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5" gracePeriod=10 Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.747956 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-lock\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.748003 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-cache\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.748033 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.748065 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22hq5\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-kube-api-access-22hq5\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.748131 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: E1006 08:37:59.748280 4991 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 08:37:59 crc kubenswrapper[4991]: E1006 08:37:59.748308 4991 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 08:37:59 crc kubenswrapper[4991]: E1006 08:37:59.748354 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift podName:14cb118a-286e-4ded-890d-fc788f9361f4 nodeName:}" failed. No retries permitted until 2025-10-06 08:38:00.248338086 +0000 UTC m=+1131.986088117 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift") pod "swift-storage-0" (UID: "14cb118a-286e-4ded-890d-fc788f9361f4") : configmap "swift-ring-files" not found Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.748426 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-lock\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.748465 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-cache\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.748722 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.772995 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22hq5\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-kube-api-access-22hq5\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:37:59 crc kubenswrapper[4991]: I1006 08:37:59.779190 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " 
pod="openstack/swift-storage-0" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.008404 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.063476 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.155587 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.257491 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-ovsdbserver-sb\") pod \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.257580 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-289wb\" (UniqueName: \"kubernetes.io/projected/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-kube-api-access-289wb\") pod \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.257645 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-dns-svc\") pod \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.257679 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-config\") pod \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\" (UID: \"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518\") " Oct 06 08:38:00 crc 
kubenswrapper[4991]: I1006 08:38:00.257977 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:38:00 crc kubenswrapper[4991]: E1006 08:38:00.258105 4991 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 08:38:00 crc kubenswrapper[4991]: E1006 08:38:00.258122 4991 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 08:38:00 crc kubenswrapper[4991]: E1006 08:38:00.258163 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift podName:14cb118a-286e-4ded-890d-fc788f9361f4 nodeName:}" failed. No retries permitted until 2025-10-06 08:38:01.258147781 +0000 UTC m=+1132.995897802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift") pod "swift-storage-0" (UID: "14cb118a-286e-4ded-890d-fc788f9361f4") : configmap "swift-ring-files" not found Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.263531 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-kube-api-access-289wb" (OuterVolumeSpecName: "kube-api-access-289wb") pod "4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" (UID: "4f1201ff-e6ae-4e4c-896a-5af8ec2ec518"). InnerVolumeSpecName "kube-api-access-289wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.306345 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" (UID: "4f1201ff-e6ae-4e4c-896a-5af8ec2ec518"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.306909 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" (UID: "4f1201ff-e6ae-4e4c-896a-5af8ec2ec518"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.319049 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-config" (OuterVolumeSpecName: "config") pod "4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" (UID: "4f1201ff-e6ae-4e4c-896a-5af8ec2ec518"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.361485 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.361517 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-289wb\" (UniqueName: \"kubernetes.io/projected/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-kube-api-access-289wb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.361534 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.361545 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.725358 4991 generic.go:334] "Generic (PLEG): container finished" podID="4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" containerID="7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5" exitCode=0 Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.725404 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" event={"ID":"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518","Type":"ContainerDied","Data":"7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5"} Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.725705 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" event={"ID":"4f1201ff-e6ae-4e4c-896a-5af8ec2ec518","Type":"ContainerDied","Data":"1a845cb7250d6262564e31f8a2d930709e4976acbc6b9ff1958398b061694f89"} Oct 06 
08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.725724 4991 scope.go:117] "RemoveContainer" containerID="7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.725416 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-q24qh" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.727949 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" event={"ID":"240e42d0-244a-4222-9c12-131cb5ab3be6","Type":"ContainerStarted","Data":"ccbb42e55502e4eee5bc56b5016c4468ab2acf943f34b02551c08340b32bb150"} Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.746934 4991 scope.go:117] "RemoveContainer" containerID="95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.751757 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" podStartSLOduration=2.751735448 podStartE2EDuration="2.751735448s" podCreationTimestamp="2025-10-06 08:37:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:00.747566721 +0000 UTC m=+1132.485316832" watchObservedRunningTime="2025-10-06 08:38:00.751735448 +0000 UTC m=+1132.489485459" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.763729 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-q24qh"] Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.769543 4991 scope.go:117] "RemoveContainer" containerID="7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5" Oct 06 08:38:00 crc kubenswrapper[4991]: E1006 08:38:00.769965 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5\": container with ID starting with 7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5 not found: ID does not exist" containerID="7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.770001 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5"} err="failed to get container status \"7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5\": rpc error: code = NotFound desc = could not find container \"7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5\": container with ID starting with 7839d093936015817e09c9f900554e1aceff0a827154eabb22a06b9e98f8b6f5 not found: ID does not exist" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.770031 4991 scope.go:117] "RemoveContainer" containerID="95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602" Oct 06 08:38:00 crc kubenswrapper[4991]: E1006 08:38:00.770361 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602\": container with ID starting with 95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602 not found: ID does not exist" containerID="95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.770399 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602"} err="failed to get container status \"95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602\": rpc error: code = NotFound desc = could not find container \"95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602\": container with ID 
starting with 95e66c5e95b697fe7a5df92b18914b1b4b2832d750601735feb31102bc09d602 not found: ID does not exist" Oct 06 08:38:00 crc kubenswrapper[4991]: I1006 08:38:00.770416 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-q24qh"] Oct 06 08:38:01 crc kubenswrapper[4991]: I1006 08:38:01.258247 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" path="/var/lib/kubelet/pods/4f1201ff-e6ae-4e4c-896a-5af8ec2ec518/volumes" Oct 06 08:38:01 crc kubenswrapper[4991]: I1006 08:38:01.276439 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:38:01 crc kubenswrapper[4991]: E1006 08:38:01.276669 4991 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 08:38:01 crc kubenswrapper[4991]: E1006 08:38:01.276696 4991 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 08:38:01 crc kubenswrapper[4991]: E1006 08:38:01.276780 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift podName:14cb118a-286e-4ded-890d-fc788f9361f4 nodeName:}" failed. No retries permitted until 2025-10-06 08:38:03.276762683 +0000 UTC m=+1135.014512714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift") pod "swift-storage-0" (UID: "14cb118a-286e-4ded-890d-fc788f9361f4") : configmap "swift-ring-files" not found Oct 06 08:38:01 crc kubenswrapper[4991]: I1006 08:38:01.739558 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:38:02 crc kubenswrapper[4991]: I1006 08:38:02.133441 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 06 08:38:02 crc kubenswrapper[4991]: I1006 08:38:02.207732 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.316819 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:38:03 crc kubenswrapper[4991]: E1006 08:38:03.317118 4991 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 08:38:03 crc kubenswrapper[4991]: E1006 08:38:03.317510 4991 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 08:38:03 crc kubenswrapper[4991]: E1006 08:38:03.317583 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift podName:14cb118a-286e-4ded-890d-fc788f9361f4 nodeName:}" failed. No retries permitted until 2025-10-06 08:38:07.317561847 +0000 UTC m=+1139.055311878 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift") pod "swift-storage-0" (UID: "14cb118a-286e-4ded-890d-fc788f9361f4") : configmap "swift-ring-files" not found Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.380253 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mfsxd"] Oct 06 08:38:03 crc kubenswrapper[4991]: E1006 08:38:03.380589 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" containerName="dnsmasq-dns" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.380607 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" containerName="dnsmasq-dns" Oct 06 08:38:03 crc kubenswrapper[4991]: E1006 08:38:03.380644 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" containerName="init" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.380652 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" containerName="init" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.380851 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1201ff-e6ae-4e4c-896a-5af8ec2ec518" containerName="dnsmasq-dns" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.381432 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.384705 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.385380 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.385659 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.407193 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-mfsxd"] Oct 06 08:38:03 crc kubenswrapper[4991]: E1006 08:38:03.407949 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-svk9v ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-svk9v ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-mfsxd" podUID="7bbc8d66-1e1d-4427-86cb-eb01c221d037" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.433305 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gc8kg"] Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.434379 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.440483 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gc8kg"] Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.447014 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-mfsxd"] Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.520952 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-combined-ca-bundle\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521082 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bbc8d66-1e1d-4427-86cb-eb01c221d037-etc-swift\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521128 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-scripts\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521214 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-ring-data-devices\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 
06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521286 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-swiftconf\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521355 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqxgn\" (UniqueName: \"kubernetes.io/projected/a462bacd-997b-4e65-89d3-1db409e5b26b-kube-api-access-rqxgn\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521410 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a462bacd-997b-4e65-89d3-1db409e5b26b-etc-swift\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521590 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-ring-data-devices\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521643 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-combined-ca-bundle\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " 
pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521683 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-dispersionconf\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521704 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-dispersionconf\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521732 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-swiftconf\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521851 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svk9v\" (UniqueName: \"kubernetes.io/projected/7bbc8d66-1e1d-4427-86cb-eb01c221d037-kube-api-access-svk9v\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.521956 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-scripts\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") 
" pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.623988 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-swiftconf\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.624203 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svk9v\" (UniqueName: \"kubernetes.io/projected/7bbc8d66-1e1d-4427-86cb-eb01c221d037-kube-api-access-svk9v\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.624314 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-scripts\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.624360 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-combined-ca-bundle\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.624395 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bbc8d66-1e1d-4427-86cb-eb01c221d037-etc-swift\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 
08:38:03.624437 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-scripts\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.624476 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-ring-data-devices\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.624529 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-swiftconf\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.624903 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqxgn\" (UniqueName: \"kubernetes.io/projected/a462bacd-997b-4e65-89d3-1db409e5b26b-kube-api-access-rqxgn\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.625078 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bbc8d66-1e1d-4427-86cb-eb01c221d037-etc-swift\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.625411 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-scripts\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.625414 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-scripts\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.625696 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-ring-data-devices\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.625757 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a462bacd-997b-4e65-89d3-1db409e5b26b-etc-swift\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.625928 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-ring-data-devices\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.625966 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-combined-ca-bundle\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.626024 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-dispersionconf\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.626056 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-dispersionconf\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.626098 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a462bacd-997b-4e65-89d3-1db409e5b26b-etc-swift\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.627694 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-ring-data-devices\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.630818 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-dispersionconf\") pod 
\"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.631118 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-dispersionconf\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.631839 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-swiftconf\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.632783 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-combined-ca-bundle\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.635032 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-swiftconf\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.635405 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-combined-ca-bundle\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 
08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.642234 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svk9v\" (UniqueName: \"kubernetes.io/projected/7bbc8d66-1e1d-4427-86cb-eb01c221d037-kube-api-access-svk9v\") pod \"swift-ring-rebalance-mfsxd\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.644082 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqxgn\" (UniqueName: \"kubernetes.io/projected/a462bacd-997b-4e65-89d3-1db409e5b26b-kube-api-access-rqxgn\") pod \"swift-ring-rebalance-gc8kg\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.758165 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.758803 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.775551 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.829552 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-ring-data-devices\") pod \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.829633 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-swiftconf\") pod \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.829658 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-dispersionconf\") pod \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.829700 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svk9v\" (UniqueName: \"kubernetes.io/projected/7bbc8d66-1e1d-4427-86cb-eb01c221d037-kube-api-access-svk9v\") pod \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.829721 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bbc8d66-1e1d-4427-86cb-eb01c221d037-etc-swift\") pod \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.829772 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-combined-ca-bundle\") pod \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.829833 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-scripts\") pod \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\" (UID: \"7bbc8d66-1e1d-4427-86cb-eb01c221d037\") " Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.830660 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-scripts" (OuterVolumeSpecName: "scripts") pod "7bbc8d66-1e1d-4427-86cb-eb01c221d037" (UID: "7bbc8d66-1e1d-4427-86cb-eb01c221d037"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.831291 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7bbc8d66-1e1d-4427-86cb-eb01c221d037" (UID: "7bbc8d66-1e1d-4427-86cb-eb01c221d037"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.832142 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbc8d66-1e1d-4427-86cb-eb01c221d037-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7bbc8d66-1e1d-4427-86cb-eb01c221d037" (UID: "7bbc8d66-1e1d-4427-86cb-eb01c221d037"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.835776 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbc8d66-1e1d-4427-86cb-eb01c221d037-kube-api-access-svk9v" (OuterVolumeSpecName: "kube-api-access-svk9v") pod "7bbc8d66-1e1d-4427-86cb-eb01c221d037" (UID: "7bbc8d66-1e1d-4427-86cb-eb01c221d037"). InnerVolumeSpecName "kube-api-access-svk9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.835874 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7bbc8d66-1e1d-4427-86cb-eb01c221d037" (UID: "7bbc8d66-1e1d-4427-86cb-eb01c221d037"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.835991 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bbc8d66-1e1d-4427-86cb-eb01c221d037" (UID: "7bbc8d66-1e1d-4427-86cb-eb01c221d037"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.857539 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7bbc8d66-1e1d-4427-86cb-eb01c221d037" (UID: "7bbc8d66-1e1d-4427-86cb-eb01c221d037"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.932973 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.933009 4991 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7bbc8d66-1e1d-4427-86cb-eb01c221d037-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.933028 4991 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.933040 4991 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.933051 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svk9v\" (UniqueName: \"kubernetes.io/projected/7bbc8d66-1e1d-4427-86cb-eb01c221d037-kube-api-access-svk9v\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.933063 4991 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7bbc8d66-1e1d-4427-86cb-eb01c221d037-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:03 crc kubenswrapper[4991]: I1006 08:38:03.933074 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbc8d66-1e1d-4427-86cb-eb01c221d037-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:04 crc kubenswrapper[4991]: I1006 08:38:04.197315 4991 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gc8kg"] Oct 06 08:38:04 crc kubenswrapper[4991]: W1006 08:38:04.204542 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda462bacd_997b_4e65_89d3_1db409e5b26b.slice/crio-62e3cb7af9ace0405a9953e5d2dc758f9a90a442d90c9d6c205bd3abe6aa1342 WatchSource:0}: Error finding container 62e3cb7af9ace0405a9953e5d2dc758f9a90a442d90c9d6c205bd3abe6aa1342: Status 404 returned error can't find the container with id 62e3cb7af9ace0405a9953e5d2dc758f9a90a442d90c9d6c205bd3abe6aa1342 Oct 06 08:38:04 crc kubenswrapper[4991]: I1006 08:38:04.766700 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gc8kg" event={"ID":"a462bacd-997b-4e65-89d3-1db409e5b26b","Type":"ContainerStarted","Data":"62e3cb7af9ace0405a9953e5d2dc758f9a90a442d90c9d6c205bd3abe6aa1342"} Oct 06 08:38:04 crc kubenswrapper[4991]: I1006 08:38:04.766726 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mfsxd" Oct 06 08:38:04 crc kubenswrapper[4991]: I1006 08:38:04.821236 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-mfsxd"] Oct 06 08:38:04 crc kubenswrapper[4991]: I1006 08:38:04.830116 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-mfsxd"] Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.253791 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbc8d66-1e1d-4427-86cb-eb01c221d037" path="/var/lib/kubelet/pods/7bbc8d66-1e1d-4427-86cb-eb01c221d037/volumes" Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.506496 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.766652 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4qt96"] Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.767884 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4qt96" Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.774174 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4qt96"] Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.865731 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzktv\" (UniqueName: \"kubernetes.io/projected/674d98a6-e32d-47c6-bf03-4ecdc611beb4-kube-api-access-bzktv\") pod \"keystone-db-create-4qt96\" (UID: \"674d98a6-e32d-47c6-bf03-4ecdc611beb4\") " pod="openstack/keystone-db-create-4qt96" Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.961457 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4bh74"] Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.962495 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4bh74" Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.967502 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzktv\" (UniqueName: \"kubernetes.io/projected/674d98a6-e32d-47c6-bf03-4ecdc611beb4-kube-api-access-bzktv\") pod \"keystone-db-create-4qt96\" (UID: \"674d98a6-e32d-47c6-bf03-4ecdc611beb4\") " pod="openstack/keystone-db-create-4qt96" Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.976914 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4bh74"] Oct 06 08:38:05 crc kubenswrapper[4991]: I1006 08:38:05.989681 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzktv\" (UniqueName: \"kubernetes.io/projected/674d98a6-e32d-47c6-bf03-4ecdc611beb4-kube-api-access-bzktv\") pod \"keystone-db-create-4qt96\" (UID: \"674d98a6-e32d-47c6-bf03-4ecdc611beb4\") " pod="openstack/keystone-db-create-4qt96" Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.069091 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9447\" (UniqueName: \"kubernetes.io/projected/96a84386-eade-4e4b-a569-0dbce5dc6081-kube-api-access-v9447\") pod \"placement-db-create-4bh74\" (UID: \"96a84386-eade-4e4b-a569-0dbce5dc6081\") " pod="openstack/placement-db-create-4bh74" Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.088584 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4qt96" Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.171053 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9447\" (UniqueName: \"kubernetes.io/projected/96a84386-eade-4e4b-a569-0dbce5dc6081-kube-api-access-v9447\") pod \"placement-db-create-4bh74\" (UID: \"96a84386-eade-4e4b-a569-0dbce5dc6081\") " pod="openstack/placement-db-create-4bh74" Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.186978 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9447\" (UniqueName: \"kubernetes.io/projected/96a84386-eade-4e4b-a569-0dbce5dc6081-kube-api-access-v9447\") pod \"placement-db-create-4bh74\" (UID: \"96a84386-eade-4e4b-a569-0dbce5dc6081\") " pod="openstack/placement-db-create-4bh74" Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.238686 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lglj7"] Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.239977 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lglj7" Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.249829 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lglj7"] Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.286115 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4bh74" Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.374370 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cjt2\" (UniqueName: \"kubernetes.io/projected/a3fead61-55ea-433e-afe9-983c17ef5cdf-kube-api-access-8cjt2\") pod \"glance-db-create-lglj7\" (UID: \"a3fead61-55ea-433e-afe9-983c17ef5cdf\") " pod="openstack/glance-db-create-lglj7" Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.476722 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cjt2\" (UniqueName: \"kubernetes.io/projected/a3fead61-55ea-433e-afe9-983c17ef5cdf-kube-api-access-8cjt2\") pod \"glance-db-create-lglj7\" (UID: \"a3fead61-55ea-433e-afe9-983c17ef5cdf\") " pod="openstack/glance-db-create-lglj7" Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.497229 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cjt2\" (UniqueName: \"kubernetes.io/projected/a3fead61-55ea-433e-afe9-983c17ef5cdf-kube-api-access-8cjt2\") pod \"glance-db-create-lglj7\" (UID: \"a3fead61-55ea-433e-afe9-983c17ef5cdf\") " pod="openstack/glance-db-create-lglj7" Oct 06 08:38:06 crc kubenswrapper[4991]: I1006 08:38:06.564182 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lglj7" Oct 06 08:38:07 crc kubenswrapper[4991]: I1006 08:38:07.398028 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:38:07 crc kubenswrapper[4991]: E1006 08:38:07.398272 4991 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 08:38:07 crc kubenswrapper[4991]: E1006 08:38:07.398420 4991 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 08:38:07 crc kubenswrapper[4991]: E1006 08:38:07.398473 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift podName:14cb118a-286e-4ded-890d-fc788f9361f4 nodeName:}" failed. No retries permitted until 2025-10-06 08:38:15.398459014 +0000 UTC m=+1147.136209035 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift") pod "swift-storage-0" (UID: "14cb118a-286e-4ded-890d-fc788f9361f4") : configmap "swift-ring-files" not found Oct 06 08:38:07 crc kubenswrapper[4991]: I1006 08:38:07.785773 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lglj7"] Oct 06 08:38:07 crc kubenswrapper[4991]: W1006 08:38:07.787239 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3fead61_55ea_433e_afe9_983c17ef5cdf.slice/crio-e11b210aa57ffc33150580a762659b82a79cd61a3b6196bf5505710f9b1d20e5 WatchSource:0}: Error finding container e11b210aa57ffc33150580a762659b82a79cd61a3b6196bf5505710f9b1d20e5: Status 404 returned error can't find the container with id e11b210aa57ffc33150580a762659b82a79cd61a3b6196bf5505710f9b1d20e5 Oct 06 08:38:07 crc kubenswrapper[4991]: I1006 08:38:07.799233 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gc8kg" event={"ID":"a462bacd-997b-4e65-89d3-1db409e5b26b","Type":"ContainerStarted","Data":"8c2f621a06879a2c3c612a06ac045c0a16c5be885f62705d7fbfffbef118ca1e"} Oct 06 08:38:07 crc kubenswrapper[4991]: I1006 08:38:07.848565 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-gc8kg" podStartSLOduration=1.650684537 podStartE2EDuration="4.848540465s" podCreationTimestamp="2025-10-06 08:38:03 +0000 UTC" firstStartedPulling="2025-10-06 08:38:04.206867373 +0000 UTC m=+1135.944617394" lastFinishedPulling="2025-10-06 08:38:07.404723301 +0000 UTC m=+1139.142473322" observedRunningTime="2025-10-06 08:38:07.825486344 +0000 UTC m=+1139.563236375" watchObservedRunningTime="2025-10-06 08:38:07.848540465 +0000 UTC m=+1139.586290486" Oct 06 08:38:07 crc kubenswrapper[4991]: I1006 08:38:07.859091 4991 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/placement-db-create-4bh74"] Oct 06 08:38:07 crc kubenswrapper[4991]: W1006 08:38:07.859442 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674d98a6_e32d_47c6_bf03_4ecdc611beb4.slice/crio-b9d585446a3256b5273ef692c8980b8bdb18ad22dc320ec238ee2765bbfa8c4f WatchSource:0}: Error finding container b9d585446a3256b5273ef692c8980b8bdb18ad22dc320ec238ee2765bbfa8c4f: Status 404 returned error can't find the container with id b9d585446a3256b5273ef692c8980b8bdb18ad22dc320ec238ee2765bbfa8c4f Oct 06 08:38:07 crc kubenswrapper[4991]: I1006 08:38:07.865411 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4qt96"] Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.673782 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.740681 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mwgc9"] Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.741873 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-mwgc9" podUID="3d325ff9-b68b-455e-bec3-5116ccd9ac8d" containerName="dnsmasq-dns" containerID="cri-o://48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962" gracePeriod=10 Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.808593 4991 generic.go:334] "Generic (PLEG): container finished" podID="674d98a6-e32d-47c6-bf03-4ecdc611beb4" containerID="5d0ad4dbaf7672bd759ce4b1d4274210a70349250211f5e7c2592266f9db5df1" exitCode=0 Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.809039 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4qt96" 
event={"ID":"674d98a6-e32d-47c6-bf03-4ecdc611beb4","Type":"ContainerDied","Data":"5d0ad4dbaf7672bd759ce4b1d4274210a70349250211f5e7c2592266f9db5df1"} Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.809077 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4qt96" event={"ID":"674d98a6-e32d-47c6-bf03-4ecdc611beb4","Type":"ContainerStarted","Data":"b9d585446a3256b5273ef692c8980b8bdb18ad22dc320ec238ee2765bbfa8c4f"} Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.812627 4991 generic.go:334] "Generic (PLEG): container finished" podID="a3fead61-55ea-433e-afe9-983c17ef5cdf" containerID="5b3441530c80e9311844d2033726d1a48cc481c7a9c1cf589a46dacfd173501f" exitCode=0 Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.812708 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lglj7" event={"ID":"a3fead61-55ea-433e-afe9-983c17ef5cdf","Type":"ContainerDied","Data":"5b3441530c80e9311844d2033726d1a48cc481c7a9c1cf589a46dacfd173501f"} Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.812768 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lglj7" event={"ID":"a3fead61-55ea-433e-afe9-983c17ef5cdf","Type":"ContainerStarted","Data":"e11b210aa57ffc33150580a762659b82a79cd61a3b6196bf5505710f9b1d20e5"} Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.814713 4991 generic.go:334] "Generic (PLEG): container finished" podID="96a84386-eade-4e4b-a569-0dbce5dc6081" containerID="d2a68d324a56519f7999d2aa245e7844322a98233c289cf351e1037336ccc2f5" exitCode=0 Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.815557 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4bh74" event={"ID":"96a84386-eade-4e4b-a569-0dbce5dc6081","Type":"ContainerDied","Data":"d2a68d324a56519f7999d2aa245e7844322a98233c289cf351e1037336ccc2f5"} Oct 06 08:38:08 crc kubenswrapper[4991]: I1006 08:38:08.815593 4991 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/placement-db-create-4bh74" event={"ID":"96a84386-eade-4e4b-a569-0dbce5dc6081","Type":"ContainerStarted","Data":"d8055c85b30bb86536539ea493c23697c32968226f1a1f9f4eb81237a1cbecaa"} Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.256140 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.351783 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-config\") pod \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.351844 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-sb\") pod \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.352015 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-nb\") pod \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.352035 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhx99\" (UniqueName: \"kubernetes.io/projected/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-kube-api-access-nhx99\") pod \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.352057 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-dns-svc\") pod \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\" (UID: \"3d325ff9-b68b-455e-bec3-5116ccd9ac8d\") " Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.357877 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-kube-api-access-nhx99" (OuterVolumeSpecName: "kube-api-access-nhx99") pod "3d325ff9-b68b-455e-bec3-5116ccd9ac8d" (UID: "3d325ff9-b68b-455e-bec3-5116ccd9ac8d"). InnerVolumeSpecName "kube-api-access-nhx99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.391961 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d325ff9-b68b-455e-bec3-5116ccd9ac8d" (UID: "3d325ff9-b68b-455e-bec3-5116ccd9ac8d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.409800 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d325ff9-b68b-455e-bec3-5116ccd9ac8d" (UID: "3d325ff9-b68b-455e-bec3-5116ccd9ac8d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.409937 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-config" (OuterVolumeSpecName: "config") pod "3d325ff9-b68b-455e-bec3-5116ccd9ac8d" (UID: "3d325ff9-b68b-455e-bec3-5116ccd9ac8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.412371 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d325ff9-b68b-455e-bec3-5116ccd9ac8d" (UID: "3d325ff9-b68b-455e-bec3-5116ccd9ac8d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.455465 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.455509 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.455524 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.455536 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhx99\" (UniqueName: \"kubernetes.io/projected/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-kube-api-access-nhx99\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.455549 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d325ff9-b68b-455e-bec3-5116ccd9ac8d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.822435 4991 generic.go:334] "Generic (PLEG): container finished" podID="3d325ff9-b68b-455e-bec3-5116ccd9ac8d" 
containerID="48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962" exitCode=0 Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.822484 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mwgc9" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.822522 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mwgc9" event={"ID":"3d325ff9-b68b-455e-bec3-5116ccd9ac8d","Type":"ContainerDied","Data":"48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962"} Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.822548 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mwgc9" event={"ID":"3d325ff9-b68b-455e-bec3-5116ccd9ac8d","Type":"ContainerDied","Data":"9c6862234f1a5cc1d8d62a1e2bbc2bfb3c0ba4e903481b64094fa2576956549e"} Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.822566 4991 scope.go:117] "RemoveContainer" containerID="48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.855525 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mwgc9"] Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.858846 4991 scope.go:117] "RemoveContainer" containerID="4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.862660 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mwgc9"] Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.914474 4991 scope.go:117] "RemoveContainer" containerID="48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962" Oct 06 08:38:09 crc kubenswrapper[4991]: E1006 08:38:09.918624 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962\": container with ID starting with 48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962 not found: ID does not exist" containerID="48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.918690 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962"} err="failed to get container status \"48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962\": rpc error: code = NotFound desc = could not find container \"48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962\": container with ID starting with 48d317296c2cca378de5bc73da331e71d31fbd69aa1f122bbd1ed6b64719d962 not found: ID does not exist" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.918717 4991 scope.go:117] "RemoveContainer" containerID="4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0" Oct 06 08:38:09 crc kubenswrapper[4991]: E1006 08:38:09.922052 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0\": container with ID starting with 4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0 not found: ID does not exist" containerID="4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0" Oct 06 08:38:09 crc kubenswrapper[4991]: I1006 08:38:09.922093 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0"} err="failed to get container status \"4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0\": rpc error: code = NotFound desc = could not find container \"4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0\": container with ID 
starting with 4e050da419fc186dbb799fb516343cbfc5698c168357853a60cf378fe73f51d0 not found: ID does not exist" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.162806 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4bh74" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.196107 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4qt96" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.211947 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lglj7" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.269337 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9447\" (UniqueName: \"kubernetes.io/projected/96a84386-eade-4e4b-a569-0dbce5dc6081-kube-api-access-v9447\") pod \"96a84386-eade-4e4b-a569-0dbce5dc6081\" (UID: \"96a84386-eade-4e4b-a569-0dbce5dc6081\") " Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.275259 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a84386-eade-4e4b-a569-0dbce5dc6081-kube-api-access-v9447" (OuterVolumeSpecName: "kube-api-access-v9447") pod "96a84386-eade-4e4b-a569-0dbce5dc6081" (UID: "96a84386-eade-4e4b-a569-0dbce5dc6081"). InnerVolumeSpecName "kube-api-access-v9447". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.365679 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.371463 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzktv\" (UniqueName: \"kubernetes.io/projected/674d98a6-e32d-47c6-bf03-4ecdc611beb4-kube-api-access-bzktv\") pod \"674d98a6-e32d-47c6-bf03-4ecdc611beb4\" (UID: \"674d98a6-e32d-47c6-bf03-4ecdc611beb4\") " Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.371582 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cjt2\" (UniqueName: \"kubernetes.io/projected/a3fead61-55ea-433e-afe9-983c17ef5cdf-kube-api-access-8cjt2\") pod \"a3fead61-55ea-433e-afe9-983c17ef5cdf\" (UID: \"a3fead61-55ea-433e-afe9-983c17ef5cdf\") " Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.372116 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9447\" (UniqueName: \"kubernetes.io/projected/96a84386-eade-4e4b-a569-0dbce5dc6081-kube-api-access-v9447\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.374439 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674d98a6-e32d-47c6-bf03-4ecdc611beb4-kube-api-access-bzktv" (OuterVolumeSpecName: "kube-api-access-bzktv") pod "674d98a6-e32d-47c6-bf03-4ecdc611beb4" (UID: "674d98a6-e32d-47c6-bf03-4ecdc611beb4"). InnerVolumeSpecName "kube-api-access-bzktv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.376714 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3fead61-55ea-433e-afe9-983c17ef5cdf-kube-api-access-8cjt2" (OuterVolumeSpecName: "kube-api-access-8cjt2") pod "a3fead61-55ea-433e-afe9-983c17ef5cdf" (UID: "a3fead61-55ea-433e-afe9-983c17ef5cdf"). InnerVolumeSpecName "kube-api-access-8cjt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.474821 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cjt2\" (UniqueName: \"kubernetes.io/projected/a3fead61-55ea-433e-afe9-983c17ef5cdf-kube-api-access-8cjt2\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.474892 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzktv\" (UniqueName: \"kubernetes.io/projected/674d98a6-e32d-47c6-bf03-4ecdc611beb4-kube-api-access-bzktv\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.832458 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4bh74" event={"ID":"96a84386-eade-4e4b-a569-0dbce5dc6081","Type":"ContainerDied","Data":"d8055c85b30bb86536539ea493c23697c32968226f1a1f9f4eb81237a1cbecaa"} Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.832511 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8055c85b30bb86536539ea493c23697c32968226f1a1f9f4eb81237a1cbecaa" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.832480 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4bh74" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.838626 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4qt96" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.838627 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4qt96" event={"ID":"674d98a6-e32d-47c6-bf03-4ecdc611beb4","Type":"ContainerDied","Data":"b9d585446a3256b5273ef692c8980b8bdb18ad22dc320ec238ee2765bbfa8c4f"} Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.838673 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9d585446a3256b5273ef692c8980b8bdb18ad22dc320ec238ee2765bbfa8c4f" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.841048 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lglj7" event={"ID":"a3fead61-55ea-433e-afe9-983c17ef5cdf","Type":"ContainerDied","Data":"e11b210aa57ffc33150580a762659b82a79cd61a3b6196bf5505710f9b1d20e5"} Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.841090 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e11b210aa57ffc33150580a762659b82a79cd61a3b6196bf5505710f9b1d20e5" Oct 06 08:38:10 crc kubenswrapper[4991]: I1006 08:38:10.841143 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lglj7" Oct 06 08:38:11 crc kubenswrapper[4991]: I1006 08:38:11.254243 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d325ff9-b68b-455e-bec3-5116ccd9ac8d" path="/var/lib/kubelet/pods/3d325ff9-b68b-455e-bec3-5116ccd9ac8d/volumes" Oct 06 08:38:14 crc kubenswrapper[4991]: I1006 08:38:14.892600 4991 generic.go:334] "Generic (PLEG): container finished" podID="a462bacd-997b-4e65-89d3-1db409e5b26b" containerID="8c2f621a06879a2c3c612a06ac045c0a16c5be885f62705d7fbfffbef118ca1e" exitCode=0 Oct 06 08:38:14 crc kubenswrapper[4991]: I1006 08:38:14.892671 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gc8kg" event={"ID":"a462bacd-997b-4e65-89d3-1db409e5b26b","Type":"ContainerDied","Data":"8c2f621a06879a2c3c612a06ac045c0a16c5be885f62705d7fbfffbef118ca1e"} Oct 06 08:38:15 crc kubenswrapper[4991]: I1006 08:38:15.487020 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:38:15 crc kubenswrapper[4991]: I1006 08:38:15.505559 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift\") pod \"swift-storage-0\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " pod="openstack/swift-storage-0" Oct 06 08:38:15 crc kubenswrapper[4991]: I1006 08:38:15.805026 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.303526 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.368803 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-15db-account-create-j2zlh"] Oct 06 08:38:16 crc kubenswrapper[4991]: E1006 08:38:16.369187 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d325ff9-b68b-455e-bec3-5116ccd9ac8d" containerName="dnsmasq-dns" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369209 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d325ff9-b68b-455e-bec3-5116ccd9ac8d" containerName="dnsmasq-dns" Oct 06 08:38:16 crc kubenswrapper[4991]: E1006 08:38:16.369228 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a462bacd-997b-4e65-89d3-1db409e5b26b" containerName="swift-ring-rebalance" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369237 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a462bacd-997b-4e65-89d3-1db409e5b26b" containerName="swift-ring-rebalance" Oct 06 08:38:16 crc kubenswrapper[4991]: E1006 08:38:16.369261 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3fead61-55ea-433e-afe9-983c17ef5cdf" containerName="mariadb-database-create" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369269 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fead61-55ea-433e-afe9-983c17ef5cdf" containerName="mariadb-database-create" Oct 06 08:38:16 crc kubenswrapper[4991]: E1006 08:38:16.369282 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674d98a6-e32d-47c6-bf03-4ecdc611beb4" containerName="mariadb-database-create" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369290 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="674d98a6-e32d-47c6-bf03-4ecdc611beb4" containerName="mariadb-database-create" Oct 06 08:38:16 crc kubenswrapper[4991]: E1006 08:38:16.369306 4991 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3d325ff9-b68b-455e-bec3-5116ccd9ac8d" containerName="init" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369315 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d325ff9-b68b-455e-bec3-5116ccd9ac8d" containerName="init" Oct 06 08:38:16 crc kubenswrapper[4991]: E1006 08:38:16.369327 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a84386-eade-4e4b-a569-0dbce5dc6081" containerName="mariadb-database-create" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369335 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a84386-eade-4e4b-a569-0dbce5dc6081" containerName="mariadb-database-create" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369564 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="674d98a6-e32d-47c6-bf03-4ecdc611beb4" containerName="mariadb-database-create" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369580 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3fead61-55ea-433e-afe9-983c17ef5cdf" containerName="mariadb-database-create" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369598 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d325ff9-b68b-455e-bec3-5116ccd9ac8d" containerName="dnsmasq-dns" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369623 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a462bacd-997b-4e65-89d3-1db409e5b26b" containerName="swift-ring-rebalance" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.369633 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a84386-eade-4e4b-a569-0dbce5dc6081" containerName="mariadb-database-create" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.370260 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-15db-account-create-j2zlh" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.372641 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.379141 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-15db-account-create-j2zlh"] Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.399123 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a462bacd-997b-4e65-89d3-1db409e5b26b-etc-swift\") pod \"a462bacd-997b-4e65-89d3-1db409e5b26b\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.399200 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-ring-data-devices\") pod \"a462bacd-997b-4e65-89d3-1db409e5b26b\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.399250 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-dispersionconf\") pod \"a462bacd-997b-4e65-89d3-1db409e5b26b\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.399324 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqxgn\" (UniqueName: \"kubernetes.io/projected/a462bacd-997b-4e65-89d3-1db409e5b26b-kube-api-access-rqxgn\") pod \"a462bacd-997b-4e65-89d3-1db409e5b26b\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.399384 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-swiftconf\") pod \"a462bacd-997b-4e65-89d3-1db409e5b26b\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.399416 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-scripts\") pod \"a462bacd-997b-4e65-89d3-1db409e5b26b\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.399439 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-combined-ca-bundle\") pod \"a462bacd-997b-4e65-89d3-1db409e5b26b\" (UID: \"a462bacd-997b-4e65-89d3-1db409e5b26b\") " Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.399884 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt7tt\" (UniqueName: \"kubernetes.io/projected/75d9de29-7b2c-4544-8516-fe61912e4da9-kube-api-access-lt7tt\") pod \"glance-15db-account-create-j2zlh\" (UID: \"75d9de29-7b2c-4544-8516-fe61912e4da9\") " pod="openstack/glance-15db-account-create-j2zlh" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.401126 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a462bacd-997b-4e65-89d3-1db409e5b26b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a462bacd-997b-4e65-89d3-1db409e5b26b" (UID: "a462bacd-997b-4e65-89d3-1db409e5b26b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.401119 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a462bacd-997b-4e65-89d3-1db409e5b26b" (UID: "a462bacd-997b-4e65-89d3-1db409e5b26b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.407586 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a462bacd-997b-4e65-89d3-1db409e5b26b-kube-api-access-rqxgn" (OuterVolumeSpecName: "kube-api-access-rqxgn") pod "a462bacd-997b-4e65-89d3-1db409e5b26b" (UID: "a462bacd-997b-4e65-89d3-1db409e5b26b"). InnerVolumeSpecName "kube-api-access-rqxgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.410732 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a462bacd-997b-4e65-89d3-1db409e5b26b" (UID: "a462bacd-997b-4e65-89d3-1db409e5b26b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.423569 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-scripts" (OuterVolumeSpecName: "scripts") pod "a462bacd-997b-4e65-89d3-1db409e5b26b" (UID: "a462bacd-997b-4e65-89d3-1db409e5b26b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.425123 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a462bacd-997b-4e65-89d3-1db409e5b26b" (UID: "a462bacd-997b-4e65-89d3-1db409e5b26b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.429494 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a462bacd-997b-4e65-89d3-1db409e5b26b" (UID: "a462bacd-997b-4e65-89d3-1db409e5b26b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.450506 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 06 08:38:16 crc kubenswrapper[4991]: W1006 08:38:16.451632 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14cb118a_286e_4ded_890d_fc788f9361f4.slice/crio-0009a524b9b82e8e3d21213b28a78520227e9ba17988a0f3fbb02000a6be9944 WatchSource:0}: Error finding container 0009a524b9b82e8e3d21213b28a78520227e9ba17988a0f3fbb02000a6be9944: Status 404 returned error can't find the container with id 0009a524b9b82e8e3d21213b28a78520227e9ba17988a0f3fbb02000a6be9944 Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.500602 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt7tt\" (UniqueName: \"kubernetes.io/projected/75d9de29-7b2c-4544-8516-fe61912e4da9-kube-api-access-lt7tt\") pod \"glance-15db-account-create-j2zlh\" (UID: \"75d9de29-7b2c-4544-8516-fe61912e4da9\") " 
pod="openstack/glance-15db-account-create-j2zlh" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.500758 4991 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a462bacd-997b-4e65-89d3-1db409e5b26b-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.500779 4991 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.500795 4991 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.500810 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqxgn\" (UniqueName: \"kubernetes.io/projected/a462bacd-997b-4e65-89d3-1db409e5b26b-kube-api-access-rqxgn\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.500826 4991 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.500840 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a462bacd-997b-4e65-89d3-1db409e5b26b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.500855 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a462bacd-997b-4e65-89d3-1db409e5b26b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.520100 4991 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt7tt\" (UniqueName: \"kubernetes.io/projected/75d9de29-7b2c-4544-8516-fe61912e4da9-kube-api-access-lt7tt\") pod \"glance-15db-account-create-j2zlh\" (UID: \"75d9de29-7b2c-4544-8516-fe61912e4da9\") " pod="openstack/glance-15db-account-create-j2zlh" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.689198 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-15db-account-create-j2zlh" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.908237 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"0009a524b9b82e8e3d21213b28a78520227e9ba17988a0f3fbb02000a6be9944"} Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.909583 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gc8kg" event={"ID":"a462bacd-997b-4e65-89d3-1db409e5b26b","Type":"ContainerDied","Data":"62e3cb7af9ace0405a9953e5d2dc758f9a90a442d90c9d6c205bd3abe6aa1342"} Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.909606 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62e3cb7af9ace0405a9953e5d2dc758f9a90a442d90c9d6c205bd3abe6aa1342" Oct 06 08:38:16 crc kubenswrapper[4991]: I1006 08:38:16.909657 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gc8kg" Oct 06 08:38:17 crc kubenswrapper[4991]: I1006 08:38:17.143417 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-15db-account-create-j2zlh"] Oct 06 08:38:17 crc kubenswrapper[4991]: I1006 08:38:17.918334 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-15db-account-create-j2zlh" event={"ID":"75d9de29-7b2c-4544-8516-fe61912e4da9","Type":"ContainerDied","Data":"ca13f7ecc36df43a2b7566361dbd728e4e7c12b4be68e74f14a5ac1f6960d766"} Oct 06 08:38:17 crc kubenswrapper[4991]: I1006 08:38:17.918281 4991 generic.go:334] "Generic (PLEG): container finished" podID="75d9de29-7b2c-4544-8516-fe61912e4da9" containerID="ca13f7ecc36df43a2b7566361dbd728e4e7c12b4be68e74f14a5ac1f6960d766" exitCode=0 Oct 06 08:38:17 crc kubenswrapper[4991]: I1006 08:38:17.918641 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-15db-account-create-j2zlh" event={"ID":"75d9de29-7b2c-4544-8516-fe61912e4da9","Type":"ContainerStarted","Data":"1cbcd5883f0a0d63164b99894c4969c5c121fcc7a026cbdec71ea881288c6901"} Oct 06 08:38:19 crc kubenswrapper[4991]: I1006 08:38:19.259562 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-15db-account-create-j2zlh" Oct 06 08:38:19 crc kubenswrapper[4991]: I1006 08:38:19.443005 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt7tt\" (UniqueName: \"kubernetes.io/projected/75d9de29-7b2c-4544-8516-fe61912e4da9-kube-api-access-lt7tt\") pod \"75d9de29-7b2c-4544-8516-fe61912e4da9\" (UID: \"75d9de29-7b2c-4544-8516-fe61912e4da9\") " Oct 06 08:38:19 crc kubenswrapper[4991]: I1006 08:38:19.452654 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d9de29-7b2c-4544-8516-fe61912e4da9-kube-api-access-lt7tt" (OuterVolumeSpecName: "kube-api-access-lt7tt") pod "75d9de29-7b2c-4544-8516-fe61912e4da9" (UID: "75d9de29-7b2c-4544-8516-fe61912e4da9"). InnerVolumeSpecName "kube-api-access-lt7tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:19 crc kubenswrapper[4991]: I1006 08:38:19.545197 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt7tt\" (UniqueName: \"kubernetes.io/projected/75d9de29-7b2c-4544-8516-fe61912e4da9-kube-api-access-lt7tt\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:19 crc kubenswrapper[4991]: I1006 08:38:19.933251 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"ed12c4a932f30894215eff330feb00b02897cadb829ca357ed1fd45e5afdf1b3"} Oct 06 08:38:19 crc kubenswrapper[4991]: I1006 08:38:19.934589 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-15db-account-create-j2zlh" event={"ID":"75d9de29-7b2c-4544-8516-fe61912e4da9","Type":"ContainerDied","Data":"1cbcd5883f0a0d63164b99894c4969c5c121fcc7a026cbdec71ea881288c6901"} Oct 06 08:38:19 crc kubenswrapper[4991]: I1006 08:38:19.934611 4991 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1cbcd5883f0a0d63164b99894c4969c5c121fcc7a026cbdec71ea881288c6901" Oct 06 08:38:19 crc kubenswrapper[4991]: I1006 08:38:19.934645 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-15db-account-create-j2zlh" Oct 06 08:38:20 crc kubenswrapper[4991]: I1006 08:38:20.948651 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"3b537ff709c1788e201f7be5c9872d032b3f628ae4187cee84bd9ddc9645c96c"} Oct 06 08:38:20 crc kubenswrapper[4991]: I1006 08:38:20.948992 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"cacc49468ee93ceabe894ccc8d50085a9655611b6c4501bf305bb67771d140e5"} Oct 06 08:38:20 crc kubenswrapper[4991]: I1006 08:38:20.949003 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"c9ef1fa176e4762e4800cf8c17d38583018327434b1f427f17c6368143ce1443"} Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.582717 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-b5jwb"] Oct 06 08:38:21 crc kubenswrapper[4991]: E1006 08:38:21.583129 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d9de29-7b2c-4544-8516-fe61912e4da9" containerName="mariadb-account-create" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.583153 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d9de29-7b2c-4544-8516-fe61912e4da9" containerName="mariadb-account-create" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.583402 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d9de29-7b2c-4544-8516-fe61912e4da9" containerName="mariadb-account-create" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 
08:38:21.584092 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.586215 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.586760 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lt5hb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.589824 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b5jwb"] Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.681576 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-combined-ca-bundle\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.681619 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-db-sync-config-data\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.681661 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-config-data\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.681695 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg56h\" 
(UniqueName: \"kubernetes.io/projected/c473952b-d738-4c47-a5e2-c6f827ff4730-kube-api-access-lg56h\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.783969 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-config-data\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.784054 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg56h\" (UniqueName: \"kubernetes.io/projected/c473952b-d738-4c47-a5e2-c6f827ff4730-kube-api-access-lg56h\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.784147 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-combined-ca-bundle\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.784193 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-db-sync-config-data\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.790049 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-db-sync-config-data\") pod 
\"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.790143 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-combined-ca-bundle\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.801748 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg56h\" (UniqueName: \"kubernetes.io/projected/c473952b-d738-4c47-a5e2-c6f827ff4730-kube-api-access-lg56h\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.805165 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-config-data\") pod \"glance-db-sync-b5jwb\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:21 crc kubenswrapper[4991]: I1006 08:38:21.906259 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.123442 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-jklxx" podUID="188f566f-7d4a-4b9f-b74d-bbee761c0bea" containerName="ovn-controller" probeResult="failure" output=< Oct 06 08:38:22 crc kubenswrapper[4991]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 06 08:38:22 crc kubenswrapper[4991]: > Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.127697 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.139622 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.347880 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jklxx-config-lwmv9"] Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.349139 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.351833 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.363681 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jklxx-config-lwmv9"] Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.450301 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b5jwb"] Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.495949 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-scripts\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.496015 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-additional-scripts\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.496043 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run-ovn\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.496064 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.496099 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tmvx\" (UniqueName: \"kubernetes.io/projected/a3c490a3-7c93-4c6c-bc83-73c4997b6492-kube-api-access-7tmvx\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.496152 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-log-ovn\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.597693 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-additional-scripts\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.597748 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run-ovn\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.597770 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.597802 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tmvx\" (UniqueName: \"kubernetes.io/projected/a3c490a3-7c93-4c6c-bc83-73c4997b6492-kube-api-access-7tmvx\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.597858 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-log-ovn\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.597905 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-scripts\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.598021 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.598051 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-log-ovn\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.598059 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run-ovn\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.598699 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-additional-scripts\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.599814 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-scripts\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.618789 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tmvx\" (UniqueName: \"kubernetes.io/projected/a3c490a3-7c93-4c6c-bc83-73c4997b6492-kube-api-access-7tmvx\") pod \"ovn-controller-jklxx-config-lwmv9\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.684740 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.963683 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b5jwb" event={"ID":"c473952b-d738-4c47-a5e2-c6f827ff4730","Type":"ContainerStarted","Data":"8dd6cfd82c5aa3dc5e2d1990beb9ad6107aeba30e1f5b77e5e9ba10f8dde60d6"} Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.968104 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"aebf96364238cb6b3d252db6049f87fc6c27dc0650a174ecda7b2742358b2979"} Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.968135 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"cc510399cff86b9534906da4fd4dfb566ffc21c65dc3e7a29de4d1e16e9e7f7a"} Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.968146 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"abaa2e04344e35bc84fdbd617310659cf3403a7924fe6ea867f216abcc6fa8c7"} Oct 06 08:38:22 crc kubenswrapper[4991]: I1006 08:38:22.968157 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"18a56e04769a024151f561f4820a607601164263d72a0ba3ba3c5a8eb7b72631"} Oct 06 08:38:23 crc kubenswrapper[4991]: I1006 08:38:23.091129 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jklxx-config-lwmv9"] Oct 06 08:38:23 crc kubenswrapper[4991]: I1006 08:38:23.978735 4991 generic.go:334] "Generic (PLEG): container finished" podID="a3c490a3-7c93-4c6c-bc83-73c4997b6492" 
containerID="297d9cbbd1aa92c82c12d9afb0a240a23ded5b725c38bc8c336fbecf56a3f52f" exitCode=0 Oct 06 08:38:23 crc kubenswrapper[4991]: I1006 08:38:23.978807 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jklxx-config-lwmv9" event={"ID":"a3c490a3-7c93-4c6c-bc83-73c4997b6492","Type":"ContainerDied","Data":"297d9cbbd1aa92c82c12d9afb0a240a23ded5b725c38bc8c336fbecf56a3f52f"} Oct 06 08:38:23 crc kubenswrapper[4991]: I1006 08:38:23.979291 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jklxx-config-lwmv9" event={"ID":"a3c490a3-7c93-4c6c-bc83-73c4997b6492","Type":"ContainerStarted","Data":"3eca72f37aafd2f3d269b78d292f114b43172be1b151c9d4cd80296a2469a898"} Oct 06 08:38:24 crc kubenswrapper[4991]: I1006 08:38:24.993939 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"264ceea5be73f445fe8809bba7e4a58faeb85d87ce005ae2e2337c4fbd772807"} Oct 06 08:38:24 crc kubenswrapper[4991]: I1006 08:38:24.994627 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"8619a7be0d8b8d3e157358434fab68c5d39a5c107bae0e507da39b55321787f9"} Oct 06 08:38:24 crc kubenswrapper[4991]: I1006 08:38:24.994639 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"fac75ff26b47c3f0e62bea6d62aa82cb9e5265892c9bea171fe5b4d799545d4b"} Oct 06 08:38:24 crc kubenswrapper[4991]: I1006 08:38:24.994647 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"d9388ecf0c6db1afc9baa8762ef9460101639492f4059916a5452baf6ce1da9b"} Oct 06 08:38:24 crc kubenswrapper[4991]: I1006 08:38:24.998075 
4991 generic.go:334] "Generic (PLEG): container finished" podID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" containerID="545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381" exitCode=0 Oct 06 08:38:24 crc kubenswrapper[4991]: I1006 08:38:24.998222 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1","Type":"ContainerDied","Data":"545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381"} Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.385405 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.550693 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tmvx\" (UniqueName: \"kubernetes.io/projected/a3c490a3-7c93-4c6c-bc83-73c4997b6492-kube-api-access-7tmvx\") pod \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.550828 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-additional-scripts\") pod \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.550946 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-log-ovn\") pod \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.550966 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run\") pod \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.550990 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-scripts\") pod \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.551016 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run-ovn\") pod \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\" (UID: \"a3c490a3-7c93-4c6c-bc83-73c4997b6492\") " Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.551030 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a3c490a3-7c93-4c6c-bc83-73c4997b6492" (UID: "a3c490a3-7c93-4c6c-bc83-73c4997b6492"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.551094 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run" (OuterVolumeSpecName: "var-run") pod "a3c490a3-7c93-4c6c-bc83-73c4997b6492" (UID: "a3c490a3-7c93-4c6c-bc83-73c4997b6492"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.551302 4991 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.551331 4991 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.551372 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a3c490a3-7c93-4c6c-bc83-73c4997b6492" (UID: "a3c490a3-7c93-4c6c-bc83-73c4997b6492"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.551756 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a3c490a3-7c93-4c6c-bc83-73c4997b6492" (UID: "a3c490a3-7c93-4c6c-bc83-73c4997b6492"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.552725 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-scripts" (OuterVolumeSpecName: "scripts") pod "a3c490a3-7c93-4c6c-bc83-73c4997b6492" (UID: "a3c490a3-7c93-4c6c-bc83-73c4997b6492"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.556690 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c490a3-7c93-4c6c-bc83-73c4997b6492-kube-api-access-7tmvx" (OuterVolumeSpecName: "kube-api-access-7tmvx") pod "a3c490a3-7c93-4c6c-bc83-73c4997b6492" (UID: "a3c490a3-7c93-4c6c-bc83-73c4997b6492"). InnerVolumeSpecName "kube-api-access-7tmvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.652275 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.652576 4991 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3c490a3-7c93-4c6c-bc83-73c4997b6492-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.652591 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tmvx\" (UniqueName: \"kubernetes.io/projected/a3c490a3-7c93-4c6c-bc83-73c4997b6492-kube-api-access-7tmvx\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.652601 4991 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a3c490a3-7c93-4c6c-bc83-73c4997b6492-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.780399 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1358-account-create-tthq2"] Oct 06 08:38:25 crc kubenswrapper[4991]: E1006 08:38:25.780709 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c490a3-7c93-4c6c-bc83-73c4997b6492" containerName="ovn-config" Oct 06 08:38:25 crc 
kubenswrapper[4991]: I1006 08:38:25.780723 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c490a3-7c93-4c6c-bc83-73c4997b6492" containerName="ovn-config" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.780878 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c490a3-7c93-4c6c-bc83-73c4997b6492" containerName="ovn-config" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.781366 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1358-account-create-tthq2" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.785652 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.792140 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1358-account-create-tthq2"] Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.956828 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwzj2\" (UniqueName: \"kubernetes.io/projected/d5ee6cf5-c041-47cf-aace-ccc53c7d2092-kube-api-access-mwzj2\") pod \"keystone-1358-account-create-tthq2\" (UID: \"d5ee6cf5-c041-47cf-aace-ccc53c7d2092\") " pod="openstack/keystone-1358-account-create-tthq2" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.972172 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d82d-account-create-27rsm"] Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.973185 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d82d-account-create-27rsm" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.974912 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 06 08:38:25 crc kubenswrapper[4991]: I1006 08:38:25.985540 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d82d-account-create-27rsm"] Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.008391 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jklxx-config-lwmv9" event={"ID":"a3c490a3-7c93-4c6c-bc83-73c4997b6492","Type":"ContainerDied","Data":"3eca72f37aafd2f3d269b78d292f114b43172be1b151c9d4cd80296a2469a898"} Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.008433 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eca72f37aafd2f3d269b78d292f114b43172be1b151c9d4cd80296a2469a898" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.008495 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jklxx-config-lwmv9" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.023091 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1","Type":"ContainerStarted","Data":"3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab"} Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.023283 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.039074 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"eb28a1e65b323917d5e53d7d3619b4b0894ce6380fa661067a656f0faf1a3966"} Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.039108 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"25950ee93c182d2a8f2b482674bcf125f0ce2007882775e9431029f2d5153184"} Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.039118 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerStarted","Data":"662006c1a00d0cac716c8677f83ad79a7b88245c89d3c05d4a41987440c0babd"} Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.047731 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.607995026 podStartE2EDuration="1m5.047714419s" podCreationTimestamp="2025-10-06 08:37:21 +0000 UTC" firstStartedPulling="2025-10-06 08:37:23.595943465 +0000 UTC m=+1095.333693486" lastFinishedPulling="2025-10-06 08:37:51.035662838 +0000 UTC m=+1122.773412879" observedRunningTime="2025-10-06 08:38:26.046435673 +0000 UTC 
m=+1157.784185694" watchObservedRunningTime="2025-10-06 08:38:26.047714419 +0000 UTC m=+1157.785464440" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.059178 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwzj2\" (UniqueName: \"kubernetes.io/projected/d5ee6cf5-c041-47cf-aace-ccc53c7d2092-kube-api-access-mwzj2\") pod \"keystone-1358-account-create-tthq2\" (UID: \"d5ee6cf5-c041-47cf-aace-ccc53c7d2092\") " pod="openstack/keystone-1358-account-create-tthq2" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.095991 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwzj2\" (UniqueName: \"kubernetes.io/projected/d5ee6cf5-c041-47cf-aace-ccc53c7d2092-kube-api-access-mwzj2\") pod \"keystone-1358-account-create-tthq2\" (UID: \"d5ee6cf5-c041-47cf-aace-ccc53c7d2092\") " pod="openstack/keystone-1358-account-create-tthq2" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.100039 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1358-account-create-tthq2" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.160514 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svd6h\" (UniqueName: \"kubernetes.io/projected/cb178ee5-a91d-4778-96f8-03cac37c55f5-kube-api-access-svd6h\") pod \"placement-d82d-account-create-27rsm\" (UID: \"cb178ee5-a91d-4778-96f8-03cac37c55f5\") " pod="openstack/placement-d82d-account-create-27rsm" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.262085 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svd6h\" (UniqueName: \"kubernetes.io/projected/cb178ee5-a91d-4778-96f8-03cac37c55f5-kube-api-access-svd6h\") pod \"placement-d82d-account-create-27rsm\" (UID: \"cb178ee5-a91d-4778-96f8-03cac37c55f5\") " pod="openstack/placement-d82d-account-create-27rsm" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.280078 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svd6h\" (UniqueName: \"kubernetes.io/projected/cb178ee5-a91d-4778-96f8-03cac37c55f5-kube-api-access-svd6h\") pod \"placement-d82d-account-create-27rsm\" (UID: \"cb178ee5-a91d-4778-96f8-03cac37c55f5\") " pod="openstack/placement-d82d-account-create-27rsm" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.293590 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d82d-account-create-27rsm" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.341725 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.588105163 podStartE2EDuration="28.341703728s" podCreationTimestamp="2025-10-06 08:37:58 +0000 UTC" firstStartedPulling="2025-10-06 08:38:16.453596447 +0000 UTC m=+1148.191346468" lastFinishedPulling="2025-10-06 08:38:24.207195012 +0000 UTC m=+1155.944945033" observedRunningTime="2025-10-06 08:38:26.094971441 +0000 UTC m=+1157.832721462" watchObservedRunningTime="2025-10-06 08:38:26.341703728 +0000 UTC m=+1158.079453749" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.344332 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-8h5bh"] Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.345968 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.351296 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.418194 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-8h5bh"] Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.476400 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.476457 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.476518 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.476572 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-config\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.476604 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.476647 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tgqt\" (UniqueName: \"kubernetes.io/projected/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-kube-api-access-4tgqt\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.479385 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1358-account-create-tthq2"] 
Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.513867 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jklxx-config-lwmv9"] Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.518613 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jklxx-config-lwmv9"] Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.579362 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.579615 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-config\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.579641 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.579671 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tgqt\" (UniqueName: \"kubernetes.io/projected/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-kube-api-access-4tgqt\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.579715 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.579739 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.580437 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.580674 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.581176 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.581639 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.581862 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-config\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.607359 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tgqt\" (UniqueName: \"kubernetes.io/projected/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-kube-api-access-4tgqt\") pod \"dnsmasq-dns-6d5b6d6b67-8h5bh\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.664041 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d82d-account-create-27rsm"] Oct 06 08:38:26 crc kubenswrapper[4991]: I1006 08:38:26.712503 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.055934 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d82d-account-create-27rsm" event={"ID":"cb178ee5-a91d-4778-96f8-03cac37c55f5","Type":"ContainerStarted","Data":"da83e83e66021684e9dadbeb740e9d7bfc886bf1d746ab2487aacf93a55a5577"} Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.056334 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d82d-account-create-27rsm" event={"ID":"cb178ee5-a91d-4778-96f8-03cac37c55f5","Type":"ContainerStarted","Data":"2df5937bec6c1e80c71cb889cdc2ab379d589dad485ad66c036af8977c21fa38"} Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.066676 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1358-account-create-tthq2" event={"ID":"d5ee6cf5-c041-47cf-aace-ccc53c7d2092","Type":"ContainerStarted","Data":"bfc485566a236f6ef73e7c095c103b023b4a78c4c9b57e8035394ff2a4ca0c8a"} Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.066726 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1358-account-create-tthq2" event={"ID":"d5ee6cf5-c041-47cf-aace-ccc53c7d2092","Type":"ContainerStarted","Data":"10e8d9d73b8414f4f9704b7dd6e02b9467584b7cba8100f46248991b61d00246"} Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.072092 4991 generic.go:334] "Generic (PLEG): container finished" podID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" containerID="30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f" exitCode=0 Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.072372 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53c6aca4-4fd0-4d42-bbe2-4b6e91643503","Type":"ContainerDied","Data":"30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f"} Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.087604 4991 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d82d-account-create-27rsm" podStartSLOduration=2.087576659 podStartE2EDuration="2.087576659s" podCreationTimestamp="2025-10-06 08:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:27.076907639 +0000 UTC m=+1158.814657670" watchObservedRunningTime="2025-10-06 08:38:27.087576659 +0000 UTC m=+1158.825326680" Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.098806 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-1358-account-create-tthq2" podStartSLOduration=2.0987880150000002 podStartE2EDuration="2.098788015s" podCreationTimestamp="2025-10-06 08:38:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:27.094755392 +0000 UTC m=+1158.832505413" watchObservedRunningTime="2025-10-06 08:38:27.098788015 +0000 UTC m=+1158.836538036" Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.188880 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-8h5bh"] Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.193967 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-jklxx" Oct 06 08:38:27 crc kubenswrapper[4991]: I1006 08:38:27.255661 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c490a3-7c93-4c6c-bc83-73c4997b6492" path="/var/lib/kubelet/pods/a3c490a3-7c93-4c6c-bc83-73c4997b6492/volumes" Oct 06 08:38:28 crc kubenswrapper[4991]: I1006 08:38:28.081679 4991 generic.go:334] "Generic (PLEG): container finished" podID="cb178ee5-a91d-4778-96f8-03cac37c55f5" containerID="da83e83e66021684e9dadbeb740e9d7bfc886bf1d746ab2487aacf93a55a5577" exitCode=0 Oct 06 08:38:28 crc kubenswrapper[4991]: I1006 08:38:28.081744 
4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d82d-account-create-27rsm" event={"ID":"cb178ee5-a91d-4778-96f8-03cac37c55f5","Type":"ContainerDied","Data":"da83e83e66021684e9dadbeb740e9d7bfc886bf1d746ab2487aacf93a55a5577"} Oct 06 08:38:28 crc kubenswrapper[4991]: I1006 08:38:28.085136 4991 generic.go:334] "Generic (PLEG): container finished" podID="d5ee6cf5-c041-47cf-aace-ccc53c7d2092" containerID="bfc485566a236f6ef73e7c095c103b023b4a78c4c9b57e8035394ff2a4ca0c8a" exitCode=0 Oct 06 08:38:28 crc kubenswrapper[4991]: I1006 08:38:28.085255 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1358-account-create-tthq2" event={"ID":"d5ee6cf5-c041-47cf-aace-ccc53c7d2092","Type":"ContainerDied","Data":"bfc485566a236f6ef73e7c095c103b023b4a78c4c9b57e8035394ff2a4ca0c8a"} Oct 06 08:38:28 crc kubenswrapper[4991]: I1006 08:38:28.086579 4991 generic.go:334] "Generic (PLEG): container finished" podID="ddd94528-deb5-46b2-b5c5-5aba9b33b05d" containerID="8acc746533318f731fb98e18db082b73437f09cba726085cb1598e3c4dc70e47" exitCode=0 Oct 06 08:38:28 crc kubenswrapper[4991]: I1006 08:38:28.086630 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" event={"ID":"ddd94528-deb5-46b2-b5c5-5aba9b33b05d","Type":"ContainerDied","Data":"8acc746533318f731fb98e18db082b73437f09cba726085cb1598e3c4dc70e47"} Oct 06 08:38:28 crc kubenswrapper[4991]: I1006 08:38:28.086685 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" event={"ID":"ddd94528-deb5-46b2-b5c5-5aba9b33b05d","Type":"ContainerStarted","Data":"deb88501fdc75460d6f35d73bdded82846b9f946a33769d4569d0c000439c9d6"} Oct 06 08:38:28 crc kubenswrapper[4991]: I1006 08:38:28.088333 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"53c6aca4-4fd0-4d42-bbe2-4b6e91643503","Type":"ContainerStarted","Data":"304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c"} Oct 06 08:38:28 crc kubenswrapper[4991]: I1006 08:38:28.088539 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 08:38:28 crc kubenswrapper[4991]: I1006 08:38:28.168597 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371969.686201 podStartE2EDuration="1m7.16857487s" podCreationTimestamp="2025-10-06 08:37:21 +0000 UTC" firstStartedPulling="2025-10-06 08:37:23.258203942 +0000 UTC m=+1094.995953963" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:28.160043649 +0000 UTC m=+1159.897793670" watchObservedRunningTime="2025-10-06 08:38:28.16857487 +0000 UTC m=+1159.906324891" Oct 06 08:38:29 crc kubenswrapper[4991]: I1006 08:38:29.098025 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" event={"ID":"ddd94528-deb5-46b2-b5c5-5aba9b33b05d","Type":"ContainerStarted","Data":"ace95d799694a982e2861042f2798315a25cc282e7a43165e328ad10f02f4aa7"} Oct 06 08:38:29 crc kubenswrapper[4991]: I1006 08:38:29.099032 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:29 crc kubenswrapper[4991]: I1006 08:38:29.117770 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" podStartSLOduration=3.117754974 podStartE2EDuration="3.117754974s" podCreationTimestamp="2025-10-06 08:38:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:29.115815069 +0000 UTC m=+1160.853565080" watchObservedRunningTime="2025-10-06 08:38:29.117754974 +0000 UTC m=+1160.855504985" Oct 06 08:38:36 crc 
kubenswrapper[4991]: I1006 08:38:36.714671 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:38:36 crc kubenswrapper[4991]: I1006 08:38:36.835144 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7mv7x"] Oct 06 08:38:36 crc kubenswrapper[4991]: I1006 08:38:36.836755 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" podUID="240e42d0-244a-4222-9c12-131cb5ab3be6" containerName="dnsmasq-dns" containerID="cri-o://ccbb42e55502e4eee5bc56b5016c4468ab2acf943f34b02551c08340b32bb150" gracePeriod=10 Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.171916 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1358-account-create-tthq2" event={"ID":"d5ee6cf5-c041-47cf-aace-ccc53c7d2092","Type":"ContainerDied","Data":"10e8d9d73b8414f4f9704b7dd6e02b9467584b7cba8100f46248991b61d00246"} Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.172196 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e8d9d73b8414f4f9704b7dd6e02b9467584b7cba8100f46248991b61d00246" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.175575 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d82d-account-create-27rsm" event={"ID":"cb178ee5-a91d-4778-96f8-03cac37c55f5","Type":"ContainerDied","Data":"2df5937bec6c1e80c71cb889cdc2ab379d589dad485ad66c036af8977c21fa38"} Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.175627 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2df5937bec6c1e80c71cb889cdc2ab379d589dad485ad66c036af8977c21fa38" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.177537 4991 generic.go:334] "Generic (PLEG): container finished" podID="240e42d0-244a-4222-9c12-131cb5ab3be6" 
containerID="ccbb42e55502e4eee5bc56b5016c4468ab2acf943f34b02551c08340b32bb150" exitCode=0 Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.177563 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" event={"ID":"240e42d0-244a-4222-9c12-131cb5ab3be6","Type":"ContainerDied","Data":"ccbb42e55502e4eee5bc56b5016c4468ab2acf943f34b02551c08340b32bb150"} Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.179646 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d82d-account-create-27rsm" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.197391 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1358-account-create-tthq2" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.366664 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.367736 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svd6h\" (UniqueName: \"kubernetes.io/projected/cb178ee5-a91d-4778-96f8-03cac37c55f5-kube-api-access-svd6h\") pod \"cb178ee5-a91d-4778-96f8-03cac37c55f5\" (UID: \"cb178ee5-a91d-4778-96f8-03cac37c55f5\") " Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.367903 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwzj2\" (UniqueName: \"kubernetes.io/projected/d5ee6cf5-c041-47cf-aace-ccc53c7d2092-kube-api-access-mwzj2\") pod \"d5ee6cf5-c041-47cf-aace-ccc53c7d2092\" (UID: \"d5ee6cf5-c041-47cf-aace-ccc53c7d2092\") " Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.372193 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ee6cf5-c041-47cf-aace-ccc53c7d2092-kube-api-access-mwzj2" (OuterVolumeSpecName: "kube-api-access-mwzj2") pod 
"d5ee6cf5-c041-47cf-aace-ccc53c7d2092" (UID: "d5ee6cf5-c041-47cf-aace-ccc53c7d2092"). InnerVolumeSpecName "kube-api-access-mwzj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.373406 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb178ee5-a91d-4778-96f8-03cac37c55f5-kube-api-access-svd6h" (OuterVolumeSpecName: "kube-api-access-svd6h") pod "cb178ee5-a91d-4778-96f8-03cac37c55f5" (UID: "cb178ee5-a91d-4778-96f8-03cac37c55f5"). InnerVolumeSpecName "kube-api-access-svd6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.469662 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-config\") pod \"240e42d0-244a-4222-9c12-131cb5ab3be6\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.469740 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wshm\" (UniqueName: \"kubernetes.io/projected/240e42d0-244a-4222-9c12-131cb5ab3be6-kube-api-access-6wshm\") pod \"240e42d0-244a-4222-9c12-131cb5ab3be6\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.469832 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-dns-svc\") pod \"240e42d0-244a-4222-9c12-131cb5ab3be6\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.469892 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-sb\") pod \"240e42d0-244a-4222-9c12-131cb5ab3be6\" 
(UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.469914 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-nb\") pod \"240e42d0-244a-4222-9c12-131cb5ab3be6\" (UID: \"240e42d0-244a-4222-9c12-131cb5ab3be6\") " Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.470270 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwzj2\" (UniqueName: \"kubernetes.io/projected/d5ee6cf5-c041-47cf-aace-ccc53c7d2092-kube-api-access-mwzj2\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.470286 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svd6h\" (UniqueName: \"kubernetes.io/projected/cb178ee5-a91d-4778-96f8-03cac37c55f5-kube-api-access-svd6h\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.473140 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240e42d0-244a-4222-9c12-131cb5ab3be6-kube-api-access-6wshm" (OuterVolumeSpecName: "kube-api-access-6wshm") pod "240e42d0-244a-4222-9c12-131cb5ab3be6" (UID: "240e42d0-244a-4222-9c12-131cb5ab3be6"). InnerVolumeSpecName "kube-api-access-6wshm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.506331 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-config" (OuterVolumeSpecName: "config") pod "240e42d0-244a-4222-9c12-131cb5ab3be6" (UID: "240e42d0-244a-4222-9c12-131cb5ab3be6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.510400 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "240e42d0-244a-4222-9c12-131cb5ab3be6" (UID: "240e42d0-244a-4222-9c12-131cb5ab3be6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.515617 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "240e42d0-244a-4222-9c12-131cb5ab3be6" (UID: "240e42d0-244a-4222-9c12-131cb5ab3be6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.517469 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "240e42d0-244a-4222-9c12-131cb5ab3be6" (UID: "240e42d0-244a-4222-9c12-131cb5ab3be6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.572903 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.572937 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wshm\" (UniqueName: \"kubernetes.io/projected/240e42d0-244a-4222-9c12-131cb5ab3be6-kube-api-access-6wshm\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.572948 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.572958 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:37 crc kubenswrapper[4991]: I1006 08:38:37.572966 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/240e42d0-244a-4222-9c12-131cb5ab3be6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:38 crc kubenswrapper[4991]: I1006 08:38:38.186255 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" event={"ID":"240e42d0-244a-4222-9c12-131cb5ab3be6","Type":"ContainerDied","Data":"1f30410270acfb60d074f9f1d955ca744f2f24a67e9db55a258a818e5b1a2f87"} Oct 06 08:38:38 crc kubenswrapper[4991]: I1006 08:38:38.187087 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-7mv7x" Oct 06 08:38:38 crc kubenswrapper[4991]: I1006 08:38:38.187464 4991 scope.go:117] "RemoveContainer" containerID="ccbb42e55502e4eee5bc56b5016c4468ab2acf943f34b02551c08340b32bb150" Oct 06 08:38:38 crc kubenswrapper[4991]: I1006 08:38:38.194908 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d82d-account-create-27rsm" Oct 06 08:38:38 crc kubenswrapper[4991]: I1006 08:38:38.195571 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b5jwb" event={"ID":"c473952b-d738-4c47-a5e2-c6f827ff4730","Type":"ContainerStarted","Data":"80d2f6a1ef6afbd1ba9965b2005ac33f9bb76351dfedd91967c680c8672c4df2"} Oct 06 08:38:38 crc kubenswrapper[4991]: I1006 08:38:38.195607 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1358-account-create-tthq2" Oct 06 08:38:38 crc kubenswrapper[4991]: I1006 08:38:38.221685 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-b5jwb" podStartSLOduration=2.437388267 podStartE2EDuration="17.221661313s" podCreationTimestamp="2025-10-06 08:38:21 +0000 UTC" firstStartedPulling="2025-10-06 08:38:22.458898517 +0000 UTC m=+1154.196648538" lastFinishedPulling="2025-10-06 08:38:37.243171563 +0000 UTC m=+1168.980921584" observedRunningTime="2025-10-06 08:38:38.215938332 +0000 UTC m=+1169.953688353" watchObservedRunningTime="2025-10-06 08:38:38.221661313 +0000 UTC m=+1169.959411334" Oct 06 08:38:38 crc kubenswrapper[4991]: I1006 08:38:38.232913 4991 scope.go:117] "RemoveContainer" containerID="fc7bacd93ce7271acf251424e765f158d74e9ba61a9ebf9aee437f2e3ae4d07c" Oct 06 08:38:38 crc kubenswrapper[4991]: I1006 08:38:38.241550 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7mv7x"] Oct 06 08:38:38 crc kubenswrapper[4991]: I1006 08:38:38.248952 4991 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-7mv7x"] Oct 06 08:38:39 crc kubenswrapper[4991]: I1006 08:38:39.258583 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="240e42d0-244a-4222-9c12-131cb5ab3be6" path="/var/lib/kubelet/pods/240e42d0-244a-4222-9c12-131cb5ab3be6/volumes" Oct 06 08:38:42 crc kubenswrapper[4991]: I1006 08:38:42.750556 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.018874 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-tqm4c"] Oct 06 08:38:43 crc kubenswrapper[4991]: E1006 08:38:43.019529 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb178ee5-a91d-4778-96f8-03cac37c55f5" containerName="mariadb-account-create" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.019551 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb178ee5-a91d-4778-96f8-03cac37c55f5" containerName="mariadb-account-create" Oct 06 08:38:43 crc kubenswrapper[4991]: E1006 08:38:43.019563 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240e42d0-244a-4222-9c12-131cb5ab3be6" containerName="init" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.019569 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="240e42d0-244a-4222-9c12-131cb5ab3be6" containerName="init" Oct 06 08:38:43 crc kubenswrapper[4991]: E1006 08:38:43.019590 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240e42d0-244a-4222-9c12-131cb5ab3be6" containerName="dnsmasq-dns" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.019598 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="240e42d0-244a-4222-9c12-131cb5ab3be6" containerName="dnsmasq-dns" Oct 06 08:38:43 crc kubenswrapper[4991]: E1006 08:38:43.019608 4991 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d5ee6cf5-c041-47cf-aace-ccc53c7d2092" containerName="mariadb-account-create" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.019615 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ee6cf5-c041-47cf-aace-ccc53c7d2092" containerName="mariadb-account-create" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.019791 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb178ee5-a91d-4778-96f8-03cac37c55f5" containerName="mariadb-account-create" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.019812 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="240e42d0-244a-4222-9c12-131cb5ab3be6" containerName="dnsmasq-dns" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.019854 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ee6cf5-c041-47cf-aace-ccc53c7d2092" containerName="mariadb-account-create" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.025326 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tqm4c" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.042484 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.045060 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tqm4c"] Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.124494 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-9t6rn"] Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.125455 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-9t6rn" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.145664 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9t6rn"] Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.162696 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjrj2\" (UniqueName: \"kubernetes.io/projected/b61822bc-f709-47be-b2ed-71284622cbe1-kube-api-access-gjrj2\") pod \"cinder-db-create-tqm4c\" (UID: \"b61822bc-f709-47be-b2ed-71284622cbe1\") " pod="openstack/cinder-db-create-tqm4c" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.264612 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjrj2\" (UniqueName: \"kubernetes.io/projected/b61822bc-f709-47be-b2ed-71284622cbe1-kube-api-access-gjrj2\") pod \"cinder-db-create-tqm4c\" (UID: \"b61822bc-f709-47be-b2ed-71284622cbe1\") " pod="openstack/cinder-db-create-tqm4c" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.264730 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p9r5\" (UniqueName: \"kubernetes.io/projected/1f477921-a357-4895-bad9-8489244afd27-kube-api-access-7p9r5\") pod \"barbican-db-create-9t6rn\" (UID: \"1f477921-a357-4895-bad9-8489244afd27\") " pod="openstack/barbican-db-create-9t6rn" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.286844 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjrj2\" (UniqueName: \"kubernetes.io/projected/b61822bc-f709-47be-b2ed-71284622cbe1-kube-api-access-gjrj2\") pod \"cinder-db-create-tqm4c\" (UID: \"b61822bc-f709-47be-b2ed-71284622cbe1\") " pod="openstack/cinder-db-create-tqm4c" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.329022 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-92hrh"] Oct 06 08:38:43 crc 
kubenswrapper[4991]: I1006 08:38:43.330426 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-92hrh" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.337135 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-92hrh"] Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.343927 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tqm4c" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.370989 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p9r5\" (UniqueName: \"kubernetes.io/projected/1f477921-a357-4895-bad9-8489244afd27-kube-api-access-7p9r5\") pod \"barbican-db-create-9t6rn\" (UID: \"1f477921-a357-4895-bad9-8489244afd27\") " pod="openstack/barbican-db-create-9t6rn" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.390756 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p9r5\" (UniqueName: \"kubernetes.io/projected/1f477921-a357-4895-bad9-8489244afd27-kube-api-access-7p9r5\") pod \"barbican-db-create-9t6rn\" (UID: \"1f477921-a357-4895-bad9-8489244afd27\") " pod="openstack/barbican-db-create-9t6rn" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.405382 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vwxnk"] Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.406391 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.415704 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.416100 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-45scd" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.416489 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.416582 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.421596 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vwxnk"] Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.439183 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9t6rn" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.479137 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd8qz\" (UniqueName: \"kubernetes.io/projected/d6625a18-265d-4c50-8841-f36e4f59d79f-kube-api-access-wd8qz\") pod \"neutron-db-create-92hrh\" (UID: \"d6625a18-265d-4c50-8841-f36e4f59d79f\") " pod="openstack/neutron-db-create-92hrh" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.582890 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhmwq\" (UniqueName: \"kubernetes.io/projected/2f3aef9d-9026-440f-a163-c1caaefb69a3-kube-api-access-bhmwq\") pod \"keystone-db-sync-vwxnk\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") " pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.582978 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-combined-ca-bundle\") pod \"keystone-db-sync-vwxnk\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") " pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.583024 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-config-data\") pod \"keystone-db-sync-vwxnk\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") " pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.583088 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd8qz\" (UniqueName: \"kubernetes.io/projected/d6625a18-265d-4c50-8841-f36e4f59d79f-kube-api-access-wd8qz\") pod \"neutron-db-create-92hrh\" (UID: \"d6625a18-265d-4c50-8841-f36e4f59d79f\") " pod="openstack/neutron-db-create-92hrh" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.607030 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd8qz\" (UniqueName: \"kubernetes.io/projected/d6625a18-265d-4c50-8841-f36e4f59d79f-kube-api-access-wd8qz\") pod \"neutron-db-create-92hrh\" (UID: \"d6625a18-265d-4c50-8841-f36e4f59d79f\") " pod="openstack/neutron-db-create-92hrh" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.646761 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-92hrh" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.684217 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhmwq\" (UniqueName: \"kubernetes.io/projected/2f3aef9d-9026-440f-a163-c1caaefb69a3-kube-api-access-bhmwq\") pod \"keystone-db-sync-vwxnk\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") " pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.684323 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-combined-ca-bundle\") pod \"keystone-db-sync-vwxnk\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") " pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.684352 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-config-data\") pod \"keystone-db-sync-vwxnk\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") " pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.688061 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-config-data\") pod \"keystone-db-sync-vwxnk\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") " pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.696060 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-combined-ca-bundle\") pod \"keystone-db-sync-vwxnk\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") " pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 
08:38:43.722507 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhmwq\" (UniqueName: \"kubernetes.io/projected/2f3aef9d-9026-440f-a163-c1caaefb69a3-kube-api-access-bhmwq\") pod \"keystone-db-sync-vwxnk\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") " pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.748049 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vwxnk" Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.788464 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9t6rn"] Oct 06 08:38:43 crc kubenswrapper[4991]: W1006 08:38:43.814978 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f477921_a357_4895_bad9_8489244afd27.slice/crio-a9fd49612e1619fd57d964724e478dfd6d41db39869b541006edbe11616923f4 WatchSource:0}: Error finding container a9fd49612e1619fd57d964724e478dfd6d41db39869b541006edbe11616923f4: Status 404 returned error can't find the container with id a9fd49612e1619fd57d964724e478dfd6d41db39869b541006edbe11616923f4 Oct 06 08:38:43 crc kubenswrapper[4991]: I1006 08:38:43.967013 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tqm4c"] Oct 06 08:38:44 crc kubenswrapper[4991]: I1006 08:38:44.253078 4991 generic.go:334] "Generic (PLEG): container finished" podID="b61822bc-f709-47be-b2ed-71284622cbe1" containerID="230ecef54a8d71e96735b7f37e4c28c5d6b96546bba37975613dfcde0832897f" exitCode=0 Oct 06 08:38:44 crc kubenswrapper[4991]: I1006 08:38:44.253135 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tqm4c" event={"ID":"b61822bc-f709-47be-b2ed-71284622cbe1","Type":"ContainerDied","Data":"230ecef54a8d71e96735b7f37e4c28c5d6b96546bba37975613dfcde0832897f"} Oct 06 08:38:44 crc kubenswrapper[4991]: I1006 08:38:44.253161 
4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tqm4c" event={"ID":"b61822bc-f709-47be-b2ed-71284622cbe1","Type":"ContainerStarted","Data":"e888d4c269981ac45c23bdb753c8a58a5db2529bfd6b04e61355c5c94e54faa1"} Oct 06 08:38:44 crc kubenswrapper[4991]: I1006 08:38:44.255072 4991 generic.go:334] "Generic (PLEG): container finished" podID="1f477921-a357-4895-bad9-8489244afd27" containerID="9d1e65fa883ba5cffd5e95aa22ba7e682849355dc55829a97f9c2ae259c034c9" exitCode=0 Oct 06 08:38:44 crc kubenswrapper[4991]: I1006 08:38:44.255093 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9t6rn" event={"ID":"1f477921-a357-4895-bad9-8489244afd27","Type":"ContainerDied","Data":"9d1e65fa883ba5cffd5e95aa22ba7e682849355dc55829a97f9c2ae259c034c9"} Oct 06 08:38:44 crc kubenswrapper[4991]: I1006 08:38:44.255107 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9t6rn" event={"ID":"1f477921-a357-4895-bad9-8489244afd27","Type":"ContainerStarted","Data":"a9fd49612e1619fd57d964724e478dfd6d41db39869b541006edbe11616923f4"} Oct 06 08:38:44 crc kubenswrapper[4991]: I1006 08:38:44.320145 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vwxnk"] Oct 06 08:38:44 crc kubenswrapper[4991]: I1006 08:38:44.335691 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-92hrh"] Oct 06 08:38:44 crc kubenswrapper[4991]: W1006 08:38:44.338728 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6625a18_265d_4c50_8841_f36e4f59d79f.slice/crio-cf8f31abd48fef35d73547d40e267cc75e033e033148897ffa28ce7951f84de2 WatchSource:0}: Error finding container cf8f31abd48fef35d73547d40e267cc75e033e033148897ffa28ce7951f84de2: Status 404 returned error can't find the container with id cf8f31abd48fef35d73547d40e267cc75e033e033148897ffa28ce7951f84de2 Oct 06 08:38:45 crc 
kubenswrapper[4991]: I1006 08:38:45.276991 4991 generic.go:334] "Generic (PLEG): container finished" podID="d6625a18-265d-4c50-8841-f36e4f59d79f" containerID="f39f809086eb2bd23e1a09e4b6df9642065b898b5270ec07f191210540900b73" exitCode=0 Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.277165 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-92hrh" event={"ID":"d6625a18-265d-4c50-8841-f36e4f59d79f","Type":"ContainerDied","Data":"f39f809086eb2bd23e1a09e4b6df9642065b898b5270ec07f191210540900b73"} Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.277413 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-92hrh" event={"ID":"d6625a18-265d-4c50-8841-f36e4f59d79f","Type":"ContainerStarted","Data":"cf8f31abd48fef35d73547d40e267cc75e033e033148897ffa28ce7951f84de2"} Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.286153 4991 generic.go:334] "Generic (PLEG): container finished" podID="c473952b-d738-4c47-a5e2-c6f827ff4730" containerID="80d2f6a1ef6afbd1ba9965b2005ac33f9bb76351dfedd91967c680c8672c4df2" exitCode=0 Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.286273 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b5jwb" event={"ID":"c473952b-d738-4c47-a5e2-c6f827ff4730","Type":"ContainerDied","Data":"80d2f6a1ef6afbd1ba9965b2005ac33f9bb76351dfedd91967c680c8672c4df2"} Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.287874 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vwxnk" event={"ID":"2f3aef9d-9026-440f-a163-c1caaefb69a3","Type":"ContainerStarted","Data":"461082205d2af56b63eb25ded2ea859b4fe4f0fea2cefef1fe93205c74439a41"} Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.694518 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tqm4c" Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.699974 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9t6rn" Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.825145 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjrj2\" (UniqueName: \"kubernetes.io/projected/b61822bc-f709-47be-b2ed-71284622cbe1-kube-api-access-gjrj2\") pod \"b61822bc-f709-47be-b2ed-71284622cbe1\" (UID: \"b61822bc-f709-47be-b2ed-71284622cbe1\") " Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.825351 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p9r5\" (UniqueName: \"kubernetes.io/projected/1f477921-a357-4895-bad9-8489244afd27-kube-api-access-7p9r5\") pod \"1f477921-a357-4895-bad9-8489244afd27\" (UID: \"1f477921-a357-4895-bad9-8489244afd27\") " Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.841479 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f477921-a357-4895-bad9-8489244afd27-kube-api-access-7p9r5" (OuterVolumeSpecName: "kube-api-access-7p9r5") pod "1f477921-a357-4895-bad9-8489244afd27" (UID: "1f477921-a357-4895-bad9-8489244afd27"). InnerVolumeSpecName "kube-api-access-7p9r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.841820 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61822bc-f709-47be-b2ed-71284622cbe1-kube-api-access-gjrj2" (OuterVolumeSpecName: "kube-api-access-gjrj2") pod "b61822bc-f709-47be-b2ed-71284622cbe1" (UID: "b61822bc-f709-47be-b2ed-71284622cbe1"). InnerVolumeSpecName "kube-api-access-gjrj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.930031 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p9r5\" (UniqueName: \"kubernetes.io/projected/1f477921-a357-4895-bad9-8489244afd27-kube-api-access-7p9r5\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:45 crc kubenswrapper[4991]: I1006 08:38:45.930062 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjrj2\" (UniqueName: \"kubernetes.io/projected/b61822bc-f709-47be-b2ed-71284622cbe1-kube-api-access-gjrj2\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:46 crc kubenswrapper[4991]: I1006 08:38:46.297590 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9t6rn" event={"ID":"1f477921-a357-4895-bad9-8489244afd27","Type":"ContainerDied","Data":"a9fd49612e1619fd57d964724e478dfd6d41db39869b541006edbe11616923f4"} Oct 06 08:38:46 crc kubenswrapper[4991]: I1006 08:38:46.297904 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9fd49612e1619fd57d964724e478dfd6d41db39869b541006edbe11616923f4" Oct 06 08:38:46 crc kubenswrapper[4991]: I1006 08:38:46.297672 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9t6rn" Oct 06 08:38:46 crc kubenswrapper[4991]: I1006 08:38:46.299663 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tqm4c" event={"ID":"b61822bc-f709-47be-b2ed-71284622cbe1","Type":"ContainerDied","Data":"e888d4c269981ac45c23bdb753c8a58a5db2529bfd6b04e61355c5c94e54faa1"} Oct 06 08:38:46 crc kubenswrapper[4991]: I1006 08:38:46.299721 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e888d4c269981ac45c23bdb753c8a58a5db2529bfd6b04e61355c5c94e54faa1" Oct 06 08:38:46 crc kubenswrapper[4991]: I1006 08:38:46.299866 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tqm4c" Oct 06 08:38:48 crc kubenswrapper[4991]: I1006 08:38:48.868126 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-92hrh" Oct 06 08:38:48 crc kubenswrapper[4991]: I1006 08:38:48.909956 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:48 crc kubenswrapper[4991]: I1006 08:38:48.979961 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd8qz\" (UniqueName: \"kubernetes.io/projected/d6625a18-265d-4c50-8841-f36e4f59d79f-kube-api-access-wd8qz\") pod \"d6625a18-265d-4c50-8841-f36e4f59d79f\" (UID: \"d6625a18-265d-4c50-8841-f36e4f59d79f\") " Oct 06 08:38:48 crc kubenswrapper[4991]: I1006 08:38:48.983318 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6625a18-265d-4c50-8841-f36e4f59d79f-kube-api-access-wd8qz" (OuterVolumeSpecName: "kube-api-access-wd8qz") pod "d6625a18-265d-4c50-8841-f36e4f59d79f" (UID: "d6625a18-265d-4c50-8841-f36e4f59d79f"). InnerVolumeSpecName "kube-api-access-wd8qz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.081916 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-db-sync-config-data\") pod \"c473952b-d738-4c47-a5e2-c6f827ff4730\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.082018 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-config-data\") pod \"c473952b-d738-4c47-a5e2-c6f827ff4730\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.082083 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg56h\" (UniqueName: \"kubernetes.io/projected/c473952b-d738-4c47-a5e2-c6f827ff4730-kube-api-access-lg56h\") pod \"c473952b-d738-4c47-a5e2-c6f827ff4730\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.082145 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-combined-ca-bundle\") pod \"c473952b-d738-4c47-a5e2-c6f827ff4730\" (UID: \"c473952b-d738-4c47-a5e2-c6f827ff4730\") " Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.082652 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd8qz\" (UniqueName: \"kubernetes.io/projected/d6625a18-265d-4c50-8841-f36e4f59d79f-kube-api-access-wd8qz\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.086655 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c473952b-d738-4c47-a5e2-c6f827ff4730-kube-api-access-lg56h" 
(OuterVolumeSpecName: "kube-api-access-lg56h") pod "c473952b-d738-4c47-a5e2-c6f827ff4730" (UID: "c473952b-d738-4c47-a5e2-c6f827ff4730"). InnerVolumeSpecName "kube-api-access-lg56h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.087365 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c473952b-d738-4c47-a5e2-c6f827ff4730" (UID: "c473952b-d738-4c47-a5e2-c6f827ff4730"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.106773 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c473952b-d738-4c47-a5e2-c6f827ff4730" (UID: "c473952b-d738-4c47-a5e2-c6f827ff4730"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.133862 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-config-data" (OuterVolumeSpecName: "config-data") pod "c473952b-d738-4c47-a5e2-c6f827ff4730" (UID: "c473952b-d738-4c47-a5e2-c6f827ff4730"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.183914 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg56h\" (UniqueName: \"kubernetes.io/projected/c473952b-d738-4c47-a5e2-c6f827ff4730-kube-api-access-lg56h\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.183951 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.183961 4991 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.183970 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c473952b-d738-4c47-a5e2-c6f827ff4730-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.330225 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-92hrh" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.330252 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-92hrh" event={"ID":"d6625a18-265d-4c50-8841-f36e4f59d79f","Type":"ContainerDied","Data":"cf8f31abd48fef35d73547d40e267cc75e033e033148897ffa28ce7951f84de2"} Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.330504 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf8f31abd48fef35d73547d40e267cc75e033e033148897ffa28ce7951f84de2" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.332859 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b5jwb" event={"ID":"c473952b-d738-4c47-a5e2-c6f827ff4730","Type":"ContainerDied","Data":"8dd6cfd82c5aa3dc5e2d1990beb9ad6107aeba30e1f5b77e5e9ba10f8dde60d6"} Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.332911 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dd6cfd82c5aa3dc5e2d1990beb9ad6107aeba30e1f5b77e5e9ba10f8dde60d6" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.332918 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b5jwb" Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.335569 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vwxnk" event={"ID":"2f3aef9d-9026-440f-a163-c1caaefb69a3","Type":"ContainerStarted","Data":"1cbd110e01dc7118014251c8877f2413d8ad43399e486f3327dd5f1ac11596d2"} Oct 06 08:38:49 crc kubenswrapper[4991]: I1006 08:38:49.906094 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vwxnk" podStartSLOduration=2.493799372 podStartE2EDuration="6.906072873s" podCreationTimestamp="2025-10-06 08:38:43 +0000 UTC" firstStartedPulling="2025-10-06 08:38:44.340351269 +0000 UTC m=+1176.078101290" lastFinishedPulling="2025-10-06 08:38:48.75262478 +0000 UTC m=+1180.490374791" observedRunningTime="2025-10-06 08:38:49.375030269 +0000 UTC m=+1181.112780370" watchObservedRunningTime="2025-10-06 08:38:49.906072873 +0000 UTC m=+1181.643822914" Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.403616 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-nzwtn"] Oct 06 08:38:50 crc kubenswrapper[4991]: E1006 08:38:50.404702 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61822bc-f709-47be-b2ed-71284622cbe1" containerName="mariadb-database-create" Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.404727 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61822bc-f709-47be-b2ed-71284622cbe1" containerName="mariadb-database-create" Oct 06 08:38:50 crc kubenswrapper[4991]: E1006 08:38:50.404749 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f477921-a357-4895-bad9-8489244afd27" containerName="mariadb-database-create" Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.404756 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f477921-a357-4895-bad9-8489244afd27" containerName="mariadb-database-create" Oct 06 08:38:50 crc 
kubenswrapper[4991]: E1006 08:38:50.404776 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6625a18-265d-4c50-8841-f36e4f59d79f" containerName="mariadb-database-create" Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.404781 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6625a18-265d-4c50-8841-f36e4f59d79f" containerName="mariadb-database-create" Oct 06 08:38:50 crc kubenswrapper[4991]: E1006 08:38:50.404796 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c473952b-d738-4c47-a5e2-c6f827ff4730" containerName="glance-db-sync" Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.404804 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c473952b-d738-4c47-a5e2-c6f827ff4730" containerName="glance-db-sync" Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.404976 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61822bc-f709-47be-b2ed-71284622cbe1" containerName="mariadb-database-create" Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.404990 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f477921-a357-4895-bad9-8489244afd27" containerName="mariadb-database-create" Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.405002 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6625a18-265d-4c50-8841-f36e4f59d79f" containerName="mariadb-database-create" Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.405014 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c473952b-d738-4c47-a5e2-c6f827ff4730" containerName="glance-db-sync" Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.405905 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.431893 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-nzwtn"]
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.519495 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfk9k\" (UniqueName: \"kubernetes.io/projected/7e421039-325e-471a-9b26-3ef3ed347c77-kube-api-access-gfk9k\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.519616 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-config\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.519651 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.519886 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.519955 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.520084 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-svc\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.621208 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfk9k\" (UniqueName: \"kubernetes.io/projected/7e421039-325e-471a-9b26-3ef3ed347c77-kube-api-access-gfk9k\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.621281 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-config\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.621334 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.621387 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.621415 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.621465 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-svc\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.622346 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-svc\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.622689 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.622986 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.623357 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.626053 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-config\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.641074 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfk9k\" (UniqueName: \"kubernetes.io/projected/7e421039-325e-471a-9b26-3ef3ed347c77-kube-api-access-gfk9k\") pod \"dnsmasq-dns-895cf5cf-nzwtn\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:50 crc kubenswrapper[4991]: I1006 08:38:50.721984 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:51 crc kubenswrapper[4991]: I1006 08:38:51.179175 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-nzwtn"]
Oct 06 08:38:51 crc kubenswrapper[4991]: W1006 08:38:51.195200 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e421039_325e_471a_9b26_3ef3ed347c77.slice/crio-29936ee31624e96f5ecf4a3f72bcc90e163c9acab9b5115c8f3eb6a9fbc56ced WatchSource:0}: Error finding container 29936ee31624e96f5ecf4a3f72bcc90e163c9acab9b5115c8f3eb6a9fbc56ced: Status 404 returned error can't find the container with id 29936ee31624e96f5ecf4a3f72bcc90e163c9acab9b5115c8f3eb6a9fbc56ced
Oct 06 08:38:51 crc kubenswrapper[4991]: I1006 08:38:51.363036 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn" event={"ID":"7e421039-325e-471a-9b26-3ef3ed347c77","Type":"ContainerStarted","Data":"29936ee31624e96f5ecf4a3f72bcc90e163c9acab9b5115c8f3eb6a9fbc56ced"}
Oct 06 08:38:52 crc kubenswrapper[4991]: I1006 08:38:52.374476 4991 generic.go:334] "Generic (PLEG): container finished" podID="2f3aef9d-9026-440f-a163-c1caaefb69a3" containerID="1cbd110e01dc7118014251c8877f2413d8ad43399e486f3327dd5f1ac11596d2" exitCode=0
Oct 06 08:38:52 crc kubenswrapper[4991]: I1006 08:38:52.374554 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vwxnk" event={"ID":"2f3aef9d-9026-440f-a163-c1caaefb69a3","Type":"ContainerDied","Data":"1cbd110e01dc7118014251c8877f2413d8ad43399e486f3327dd5f1ac11596d2"}
Oct 06 08:38:52 crc kubenswrapper[4991]: I1006 08:38:52.384367 4991 generic.go:334] "Generic (PLEG): container finished" podID="7e421039-325e-471a-9b26-3ef3ed347c77" containerID="9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921" exitCode=0
Oct 06 08:38:52 crc kubenswrapper[4991]: I1006 08:38:52.384423 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn" event={"ID":"7e421039-325e-471a-9b26-3ef3ed347c77","Type":"ContainerDied","Data":"9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921"}
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.118584 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-cd31-account-create-pmq92"]
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.120441 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cd31-account-create-pmq92"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.122971 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.129196 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cd31-account-create-pmq92"]
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.268154 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lsnn\" (UniqueName: \"kubernetes.io/projected/5c3d0945-d5b0-43bc-9ecf-0c0023ba2566-kube-api-access-9lsnn\") pod \"barbican-cd31-account-create-pmq92\" (UID: \"5c3d0945-d5b0-43bc-9ecf-0c0023ba2566\") " pod="openstack/barbican-cd31-account-create-pmq92"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.330239 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b589-account-create-k2shf"]
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.331526 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b589-account-create-k2shf"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.333924 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.340787 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b589-account-create-k2shf"]
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.371518 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lsnn\" (UniqueName: \"kubernetes.io/projected/5c3d0945-d5b0-43bc-9ecf-0c0023ba2566-kube-api-access-9lsnn\") pod \"barbican-cd31-account-create-pmq92\" (UID: \"5c3d0945-d5b0-43bc-9ecf-0c0023ba2566\") " pod="openstack/barbican-cd31-account-create-pmq92"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.392245 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lsnn\" (UniqueName: \"kubernetes.io/projected/5c3d0945-d5b0-43bc-9ecf-0c0023ba2566-kube-api-access-9lsnn\") pod \"barbican-cd31-account-create-pmq92\" (UID: \"5c3d0945-d5b0-43bc-9ecf-0c0023ba2566\") " pod="openstack/barbican-cd31-account-create-pmq92"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.396672 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn" event={"ID":"7e421039-325e-471a-9b26-3ef3ed347c77","Type":"ContainerStarted","Data":"83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0"}
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.396840 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.425455 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn" podStartSLOduration=3.425434547 podStartE2EDuration="3.425434547s" podCreationTimestamp="2025-10-06 08:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:53.417510564 +0000 UTC m=+1185.155260585" watchObservedRunningTime="2025-10-06 08:38:53.425434547 +0000 UTC m=+1185.163184568"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.442320 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cd31-account-create-pmq92"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.473817 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvwq7\" (UniqueName: \"kubernetes.io/projected/d4b6a26f-f25f-401d-a645-e94f9815314c-kube-api-access-dvwq7\") pod \"cinder-b589-account-create-k2shf\" (UID: \"d4b6a26f-f25f-401d-a645-e94f9815314c\") " pod="openstack/cinder-b589-account-create-k2shf"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.575165 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvwq7\" (UniqueName: \"kubernetes.io/projected/d4b6a26f-f25f-401d-a645-e94f9815314c-kube-api-access-dvwq7\") pod \"cinder-b589-account-create-k2shf\" (UID: \"d4b6a26f-f25f-401d-a645-e94f9815314c\") " pod="openstack/cinder-b589-account-create-k2shf"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.601132 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvwq7\" (UniqueName: \"kubernetes.io/projected/d4b6a26f-f25f-401d-a645-e94f9815314c-kube-api-access-dvwq7\") pod \"cinder-b589-account-create-k2shf\" (UID: \"d4b6a26f-f25f-401d-a645-e94f9815314c\") " pod="openstack/cinder-b589-account-create-k2shf"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.658902 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b589-account-create-k2shf"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.707277 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vwxnk"
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.880927 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-combined-ca-bundle\") pod \"2f3aef9d-9026-440f-a163-c1caaefb69a3\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") "
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.881081 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-config-data\") pod \"2f3aef9d-9026-440f-a163-c1caaefb69a3\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") "
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.881162 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhmwq\" (UniqueName: \"kubernetes.io/projected/2f3aef9d-9026-440f-a163-c1caaefb69a3-kube-api-access-bhmwq\") pod \"2f3aef9d-9026-440f-a163-c1caaefb69a3\" (UID: \"2f3aef9d-9026-440f-a163-c1caaefb69a3\") "
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.886884 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f3aef9d-9026-440f-a163-c1caaefb69a3-kube-api-access-bhmwq" (OuterVolumeSpecName: "kube-api-access-bhmwq") pod "2f3aef9d-9026-440f-a163-c1caaefb69a3" (UID: "2f3aef9d-9026-440f-a163-c1caaefb69a3"). InnerVolumeSpecName "kube-api-access-bhmwq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.907215 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f3aef9d-9026-440f-a163-c1caaefb69a3" (UID: "2f3aef9d-9026-440f-a163-c1caaefb69a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.957450 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-config-data" (OuterVolumeSpecName: "config-data") pod "2f3aef9d-9026-440f-a163-c1caaefb69a3" (UID: "2f3aef9d-9026-440f-a163-c1caaefb69a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.966094 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cd31-account-create-pmq92"]
Oct 06 08:38:53 crc kubenswrapper[4991]: W1006 08:38:53.973953 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c3d0945_d5b0_43bc_9ecf_0c0023ba2566.slice/crio-385a9e9a79e339b48d81c69969e4e639a5c7326aecbaf186fe9f5b43fb52abf7 WatchSource:0}: Error finding container 385a9e9a79e339b48d81c69969e4e639a5c7326aecbaf186fe9f5b43fb52abf7: Status 404 returned error can't find the container with id 385a9e9a79e339b48d81c69969e4e639a5c7326aecbaf186fe9f5b43fb52abf7
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.982471 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhmwq\" (UniqueName: \"kubernetes.io/projected/2f3aef9d-9026-440f-a163-c1caaefb69a3-kube-api-access-bhmwq\") on node \"crc\" DevicePath \"\""
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.982506 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 08:38:53 crc kubenswrapper[4991]: I1006 08:38:53.982517 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f3aef9d-9026-440f-a163-c1caaefb69a3-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.142204 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b589-account-create-k2shf"]
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.406972 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cd31-account-create-pmq92" event={"ID":"5c3d0945-d5b0-43bc-9ecf-0c0023ba2566","Type":"ContainerStarted","Data":"385a9e9a79e339b48d81c69969e4e639a5c7326aecbaf186fe9f5b43fb52abf7"}
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.408248 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b589-account-create-k2shf" event={"ID":"d4b6a26f-f25f-401d-a645-e94f9815314c","Type":"ContainerStarted","Data":"34b1401c492b964ba74cefce6c1d15b1bf4a20ada96b00b9692f9daf0d05ef04"}
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.411496 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vwxnk" event={"ID":"2f3aef9d-9026-440f-a163-c1caaefb69a3","Type":"ContainerDied","Data":"461082205d2af56b63eb25ded2ea859b4fe4f0fea2cefef1fe93205c74439a41"}
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.411624 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="461082205d2af56b63eb25ded2ea859b4fe4f0fea2cefef1fe93205c74439a41"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.411534 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vwxnk"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.624661 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-nzwtn"]
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.653429 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wjsnl"]
Oct 06 08:38:54 crc kubenswrapper[4991]: E1006 08:38:54.653781 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f3aef9d-9026-440f-a163-c1caaefb69a3" containerName="keystone-db-sync"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.653797 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f3aef9d-9026-440f-a163-c1caaefb69a3" containerName="keystone-db-sync"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.653997 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f3aef9d-9026-440f-a163-c1caaefb69a3" containerName="keystone-db-sync"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.654501 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.657273 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.657609 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.657962 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-45scd"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.658218 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.673924 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wjsnl"]
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.728639 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-l7969"]
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.729962 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.826493 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-l7969"]
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827258 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-combined-ca-bundle\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827310 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-fernet-keys\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827338 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827359 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrnf\" (UniqueName: \"kubernetes.io/projected/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-kube-api-access-thrnf\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827442 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-credential-keys\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827482 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-config\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827504 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827518 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827538 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827563 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-config-data\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827594 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-scripts\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.827613 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbk99\" (UniqueName: \"kubernetes.io/projected/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-kube-api-access-sbk99\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.928871 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-credential-keys\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.929193 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-config\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.929311 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.929419 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.929512 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.929608 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-config-data\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.929701 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-scripts\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.929789 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbk99\" (UniqueName: \"kubernetes.io/projected/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-kube-api-access-sbk99\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.929900 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-combined-ca-bundle\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.930010 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-fernet-keys\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.930123 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.930214 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrnf\" (UniqueName: \"kubernetes.io/projected/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-kube-api-access-thrnf\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.930798 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.931133 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-config\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.931673 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.932019 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.932204 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.939588 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-scripts\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.940005 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-fernet-keys\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.940516 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-combined-ca-bundle\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.940695 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-config-data\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.947640 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-credential-keys\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.955275 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbk99\" (UniqueName: \"kubernetes.io/projected/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-kube-api-access-sbk99\") pod \"keystone-bootstrap-wjsnl\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " pod="openstack/keystone-bootstrap-wjsnl"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.959361 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrnf\" (UniqueName: \"kubernetes.io/projected/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-kube-api-access-thrnf\") pod \"dnsmasq-dns-6c9c9f998c-l7969\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " pod="openstack/dnsmasq-dns-6c9c9f998c-l7969"
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.993826 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 06 08:38:54 crc kubenswrapper[4991]: I1006 08:38:54.998850 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.005042 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.009385 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.015388 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.032351 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0"
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.032401 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0"
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.032495 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-log-httpd\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0"
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.032533 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl2nm\" (UniqueName: \"kubernetes.io/projected/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-kube-api-access-nl2nm\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0"
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.032557 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-config-data\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0"
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.032622 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-run-httpd\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0"
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.032647 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-scripts\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0"
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.063452 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-l7969"]
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.064204 4991 util.go:30]
"No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-l7969" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.074417 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4mpdq"] Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.075721 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.080162 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7z8nn" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.080629 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.080824 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.086691 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4mpdq"] Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.089341 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wjsnl" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.094787 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5v2tc"] Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.096289 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.116396 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5v2tc"] Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.136792 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.136840 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.136913 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-config\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.136993 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-run-httpd\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137018 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-scripts\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137056 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d139f7e8-c126-43bf-9a26-7692b455412b-logs\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137095 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86gc4\" (UniqueName: \"kubernetes.io/projected/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-kube-api-access-86gc4\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137133 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-scripts\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137160 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137182 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137210 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-combined-ca-bundle\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137248 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137325 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6xq8\" (UniqueName: \"kubernetes.io/projected/d139f7e8-c126-43bf-9a26-7692b455412b-kube-api-access-d6xq8\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137360 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-log-httpd\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137381 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: 
\"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137411 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-config-data\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137437 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl2nm\" (UniqueName: \"kubernetes.io/projected/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-kube-api-access-nl2nm\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137438 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-run-httpd\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137466 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-config-data\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.137671 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-log-httpd\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.141286 4991 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.151010 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.151892 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-config-data\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.160416 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-scripts\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.162436 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl2nm\" (UniqueName: \"kubernetes.io/projected/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-kube-api-access-nl2nm\") pod \"ceilometer-0\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") " pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239227 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " 
pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239273 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239316 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-config\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239362 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d139f7e8-c126-43bf-9a26-7692b455412b-logs\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239412 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86gc4\" (UniqueName: \"kubernetes.io/projected/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-kube-api-access-86gc4\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239438 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-scripts\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239463 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-combined-ca-bundle\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239480 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239526 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6xq8\" (UniqueName: \"kubernetes.io/projected/d139f7e8-c126-43bf-9a26-7692b455412b-kube-api-access-d6xq8\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239559 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.239583 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-config-data\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.240230 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d139f7e8-c126-43bf-9a26-7692b455412b-logs\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.240882 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.240900 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.241731 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.241756 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-config\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.242667 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-swift-storage-0\") pod 
\"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.243736 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-scripts\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.246852 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-combined-ca-bundle\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.247357 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-config-data\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.260343 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6xq8\" (UniqueName: \"kubernetes.io/projected/d139f7e8-c126-43bf-9a26-7692b455412b-kube-api-access-d6xq8\") pod \"placement-db-sync-4mpdq\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.261100 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86gc4\" (UniqueName: \"kubernetes.io/projected/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-kube-api-access-86gc4\") pod \"dnsmasq-dns-57c957c4ff-5v2tc\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" 
Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.355229 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.427673 4991 generic.go:334] "Generic (PLEG): container finished" podID="d4b6a26f-f25f-401d-a645-e94f9815314c" containerID="5ad03ef6a51021ee9836ed0f5b8afab91ba8065dfaba83a0ae1d7aef99eba78b" exitCode=0 Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.427902 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b589-account-create-k2shf" event={"ID":"d4b6a26f-f25f-401d-a645-e94f9815314c","Type":"ContainerDied","Data":"5ad03ef6a51021ee9836ed0f5b8afab91ba8065dfaba83a0ae1d7aef99eba78b"} Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.435416 4991 generic.go:334] "Generic (PLEG): container finished" podID="5c3d0945-d5b0-43bc-9ecf-0c0023ba2566" containerID="0d0a7b7be490409ea510848f2bbc97f380e6856575ff17ed7973b25771f88cfc" exitCode=0 Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.435631 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn" podUID="7e421039-325e-471a-9b26-3ef3ed347c77" containerName="dnsmasq-dns" containerID="cri-o://83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0" gracePeriod=10 Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.435912 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cd31-account-create-pmq92" event={"ID":"5c3d0945-d5b0-43bc-9ecf-0c0023ba2566","Type":"ContainerDied","Data":"0d0a7b7be490409ea510848f2bbc97f380e6856575ff17ed7973b25771f88cfc"} Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.494688 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4mpdq" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.503741 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.667370 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-l7969"] Oct 06 08:38:55 crc kubenswrapper[4991]: W1006 08:38:55.676867 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbed91f17_96d1_4f7d_8f38_0279ec5c8c3c.slice/crio-9b2dd5086684e37734702c59358138ef587eda7527730b626846f5de93e756d4 WatchSource:0}: Error finding container 9b2dd5086684e37734702c59358138ef587eda7527730b626846f5de93e756d4: Status 404 returned error can't find the container with id 9b2dd5086684e37734702c59358138ef587eda7527730b626846f5de93e756d4 Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.767754 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wjsnl"] Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.838756 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.842280 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.847491 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.847623 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.847909 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lt5hb" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.848877 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.853952 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.916775 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.953421 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.964495 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.964799 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.982474 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.982546 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.986819 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.986956 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5vhr\" (UniqueName: \"kubernetes.io/projected/b24f06db-f8f9-48df-8ea7-69dea0d33c26-kube-api-access-l5vhr\") pod \"glance-default-external-api-0\" (UID: 
\"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.986998 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.987105 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-config-data\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.987211 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-scripts\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:55 crc kubenswrapper[4991]: I1006 08:38:55.987339 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-logs\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.039006 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.054713 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.057773 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.091481 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-logs\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.091572 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.091673 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.091749 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.091823 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.091941 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.092323 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.092573 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.092755 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zh9r\" (UniqueName: \"kubernetes.io/projected/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-kube-api-access-5zh9r\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.092984 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5vhr\" (UniqueName: 
\"kubernetes.io/projected/b24f06db-f8f9-48df-8ea7-69dea0d33c26-kube-api-access-l5vhr\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.093058 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.093142 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-config-data\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.093426 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.093499 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-scripts\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.093571 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.093639 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-logs\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.096427 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.096967 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-logs\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.097622 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.108007 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-config-data\") pod \"glance-default-external-api-0\" 
(UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.109009 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-scripts\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.114242 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.114403 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.115535 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5vhr\" (UniqueName: \"kubernetes.io/projected/b24f06db-f8f9-48df-8ea7-69dea0d33c26-kube-api-access-l5vhr\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.166121 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " pod="openstack/glance-default-external-api-0" Oct 06 
08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.195422 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-sb\") pod \"7e421039-325e-471a-9b26-3ef3ed347c77\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.195542 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-svc\") pod \"7e421039-325e-471a-9b26-3ef3ed347c77\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.195648 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-nb\") pod \"7e421039-325e-471a-9b26-3ef3ed347c77\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.195695 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-swift-storage-0\") pod \"7e421039-325e-471a-9b26-3ef3ed347c77\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.195747 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfk9k\" (UniqueName: \"kubernetes.io/projected/7e421039-325e-471a-9b26-3ef3ed347c77-kube-api-access-gfk9k\") pod \"7e421039-325e-471a-9b26-3ef3ed347c77\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.195772 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-config\") pod \"7e421039-325e-471a-9b26-3ef3ed347c77\" (UID: \"7e421039-325e-471a-9b26-3ef3ed347c77\") " Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.196018 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.196118 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.196175 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.196240 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zh9r\" (UniqueName: \"kubernetes.io/projected/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-kube-api-access-5zh9r\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.196334 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.196375 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.196424 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-logs\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.196461 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.197134 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.197984 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " 
pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.198943 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.199069 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-logs\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.201570 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.205587 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e421039-325e-471a-9b26-3ef3ed347c77-kube-api-access-gfk9k" (OuterVolumeSpecName: "kube-api-access-gfk9k") pod "7e421039-325e-471a-9b26-3ef3ed347c77" (UID: "7e421039-325e-471a-9b26-3ef3ed347c77"). InnerVolumeSpecName "kube-api-access-gfk9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.206435 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.206983 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.219715 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.221044 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zh9r\" (UniqueName: \"kubernetes.io/projected/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-kube-api-access-5zh9r\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.245726 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 
08:38:56.254798 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4mpdq"] Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.264610 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5v2tc"] Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.268992 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e421039-325e-471a-9b26-3ef3ed347c77" (UID: "7e421039-325e-471a-9b26-3ef3ed347c77"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.277015 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e421039-325e-471a-9b26-3ef3ed347c77" (UID: "7e421039-325e-471a-9b26-3ef3ed347c77"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.291991 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-config" (OuterVolumeSpecName: "config") pod "7e421039-325e-471a-9b26-3ef3ed347c77" (UID: "7e421039-325e-471a-9b26-3ef3ed347c77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.292597 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e421039-325e-471a-9b26-3ef3ed347c77" (UID: "7e421039-325e-471a-9b26-3ef3ed347c77"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.298433 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfk9k\" (UniqueName: \"kubernetes.io/projected/7e421039-325e-471a-9b26-3ef3ed347c77-kube-api-access-gfk9k\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.298472 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.298488 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.298503 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.298516 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.304177 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e421039-325e-471a-9b26-3ef3ed347c77" (UID: "7e421039-325e-471a-9b26-3ef3ed347c77"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.343745 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.409784 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e421039-325e-471a-9b26-3ef3ed347c77-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.464519 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7","Type":"ContainerStarted","Data":"0ccdea44e2cd5d77caed8416ae9b1bfe9c4e20489e228fe015bba5a525df56d4"} Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.465926 4991 generic.go:334] "Generic (PLEG): container finished" podID="bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" containerID="04eb3b04a75105f518df73d912d22158916ebc3215fd8b6a9435bf0af863cf5c" exitCode=0 Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.465992 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-l7969" event={"ID":"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c","Type":"ContainerDied","Data":"04eb3b04a75105f518df73d912d22158916ebc3215fd8b6a9435bf0af863cf5c"} Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.466026 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-l7969" event={"ID":"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c","Type":"ContainerStarted","Data":"9b2dd5086684e37734702c59358138ef587eda7527730b626846f5de93e756d4"} Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.480917 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" event={"ID":"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7","Type":"ContainerStarted","Data":"431c1d1856e249ecd350ad2d0a4bfa30ce2dece9f342471c9a14615743a7ff08"} Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.482250 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4mpdq" 
event={"ID":"d139f7e8-c126-43bf-9a26-7692b455412b","Type":"ContainerStarted","Data":"c9dd7fd1b1482bc75371e10e6f5e7ef3e3f85c0927abad773f6305372c67dda9"} Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.491479 4991 generic.go:334] "Generic (PLEG): container finished" podID="7e421039-325e-471a-9b26-3ef3ed347c77" containerID="83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0" exitCode=0 Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.491562 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn" event={"ID":"7e421039-325e-471a-9b26-3ef3ed347c77","Type":"ContainerDied","Data":"83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0"} Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.491580 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.491606 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-nzwtn" event={"ID":"7e421039-325e-471a-9b26-3ef3ed347c77","Type":"ContainerDied","Data":"29936ee31624e96f5ecf4a3f72bcc90e163c9acab9b5115c8f3eb6a9fbc56ced"} Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.491629 4991 scope.go:117] "RemoveContainer" containerID="83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.535374 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wjsnl" event={"ID":"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5","Type":"ContainerStarted","Data":"93bde15692d80421b438da8a25bddff9e4c4228214885a56f99ff4f63cea895f"} Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.535427 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wjsnl" 
event={"ID":"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5","Type":"ContainerStarted","Data":"9bf99f731fc16ae28fd3ead547de6934bd08ea07c7fc2f57c90babe0d4d64833"} Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.571247 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wjsnl" podStartSLOduration=2.5712245879999998 podStartE2EDuration="2.571224588s" podCreationTimestamp="2025-10-06 08:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:56.560150756 +0000 UTC m=+1188.297900777" watchObservedRunningTime="2025-10-06 08:38:56.571224588 +0000 UTC m=+1188.308974609" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.756410 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-nzwtn"] Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.758402 4991 scope.go:117] "RemoveContainer" containerID="9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.761094 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-nzwtn"] Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.841855 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.852896 4991 scope.go:117] "RemoveContainer" containerID="83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0" Oct 06 08:38:56 crc kubenswrapper[4991]: E1006 08:38:56.856282 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0\": container with ID starting with 83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0 not found: ID does not exist" 
containerID="83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.856345 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0"} err="failed to get container status \"83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0\": rpc error: code = NotFound desc = could not find container \"83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0\": container with ID starting with 83e5268deff7e0af857feed14af18eb3cd035049bd3eff9e34302238e0c510a0 not found: ID does not exist" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.856410 4991 scope.go:117] "RemoveContainer" containerID="9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921" Oct 06 08:38:56 crc kubenswrapper[4991]: E1006 08:38:56.856714 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921\": container with ID starting with 9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921 not found: ID does not exist" containerID="9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921" Oct 06 08:38:56 crc kubenswrapper[4991]: I1006 08:38:56.856742 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921"} err="failed to get container status \"9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921\": rpc error: code = NotFound desc = could not find container \"9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921\": container with ID starting with 9eb151f0553e7eeeb97a929db843bc43c8fe5eb5225732611708a5c655720921 not found: ID does not exist" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.111021 4991 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-l7969" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.227141 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.237217 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cd31-account-create-pmq92" Oct 06 08:38:57 crc kubenswrapper[4991]: W1006 08:38:57.238936 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8e63ee5_2b7c_488b_b25c_4ffd11e0e29d.slice/crio-49860fe7a195d7931334a084126ad5a9072b29a097d6bd7d23b29dbd92bed983 WatchSource:0}: Error finding container 49860fe7a195d7931334a084126ad5a9072b29a097d6bd7d23b29dbd92bed983: Status 404 returned error can't find the container with id 49860fe7a195d7931334a084126ad5a9072b29a097d6bd7d23b29dbd92bed983 Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.249527 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b589-account-create-k2shf" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.250086 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-config\") pod \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.250157 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-nb\") pod \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.250184 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-svc\") pod \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.250210 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thrnf\" (UniqueName: \"kubernetes.io/projected/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-kube-api-access-thrnf\") pod \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.250331 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-swift-storage-0\") pod \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.250414 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-sb\") pod \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\" (UID: \"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c\") " Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.282854 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e421039-325e-471a-9b26-3ef3ed347c77" path="/var/lib/kubelet/pods/7e421039-325e-471a-9b26-3ef3ed347c77/volumes" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.288288 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-config" (OuterVolumeSpecName: "config") pod "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" (UID: "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.299589 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" (UID: "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.313897 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-kube-api-access-thrnf" (OuterVolumeSpecName: "kube-api-access-thrnf") pod "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" (UID: "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c"). InnerVolumeSpecName "kube-api-access-thrnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.320199 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" (UID: "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.343198 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" (UID: "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.353001 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvwq7\" (UniqueName: \"kubernetes.io/projected/d4b6a26f-f25f-401d-a645-e94f9815314c-kube-api-access-dvwq7\") pod \"d4b6a26f-f25f-401d-a645-e94f9815314c\" (UID: \"d4b6a26f-f25f-401d-a645-e94f9815314c\") " Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.353144 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lsnn\" (UniqueName: \"kubernetes.io/projected/5c3d0945-d5b0-43bc-9ecf-0c0023ba2566-kube-api-access-9lsnn\") pod \"5c3d0945-d5b0-43bc-9ecf-0c0023ba2566\" (UID: \"5c3d0945-d5b0-43bc-9ecf-0c0023ba2566\") " Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.353548 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:57 crc kubenswrapper[4991]: 
I1006 08:38:57.353560 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.353569 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.353578 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.353587 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thrnf\" (UniqueName: \"kubernetes.io/projected/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-kube-api-access-thrnf\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.356910 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" (UID: "bed91f17-96d1-4f7d-8f38-0279ec5c8c3c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.357535 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b6a26f-f25f-401d-a645-e94f9815314c-kube-api-access-dvwq7" (OuterVolumeSpecName: "kube-api-access-dvwq7") pod "d4b6a26f-f25f-401d-a645-e94f9815314c" (UID: "d4b6a26f-f25f-401d-a645-e94f9815314c"). InnerVolumeSpecName "kube-api-access-dvwq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.357731 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c3d0945-d5b0-43bc-9ecf-0c0023ba2566-kube-api-access-9lsnn" (OuterVolumeSpecName: "kube-api-access-9lsnn") pod "5c3d0945-d5b0-43bc-9ecf-0c0023ba2566" (UID: "5c3d0945-d5b0-43bc-9ecf-0c0023ba2566"). InnerVolumeSpecName "kube-api-access-9lsnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.455502 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lsnn\" (UniqueName: \"kubernetes.io/projected/5c3d0945-d5b0-43bc-9ecf-0c0023ba2566-kube-api-access-9lsnn\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.455528 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.455538 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvwq7\" (UniqueName: \"kubernetes.io/projected/d4b6a26f-f25f-401d-a645-e94f9815314c-kube-api-access-dvwq7\") on node \"crc\" DevicePath \"\"" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.553857 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b24f06db-f8f9-48df-8ea7-69dea0d33c26","Type":"ContainerStarted","Data":"ebc8106f6a691b71a9869a764758d338b221fd93f2fc6e43f61b2e488a56db4a"} Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.559053 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cd31-account-create-pmq92" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.559362 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cd31-account-create-pmq92" event={"ID":"5c3d0945-d5b0-43bc-9ecf-0c0023ba2566","Type":"ContainerDied","Data":"385a9e9a79e339b48d81c69969e4e639a5c7326aecbaf186fe9f5b43fb52abf7"} Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.559419 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="385a9e9a79e339b48d81c69969e4e639a5c7326aecbaf186fe9f5b43fb52abf7" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.592795 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b589-account-create-k2shf" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.592808 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b589-account-create-k2shf" event={"ID":"d4b6a26f-f25f-401d-a645-e94f9815314c","Type":"ContainerDied","Data":"34b1401c492b964ba74cefce6c1d15b1bf4a20ada96b00b9692f9daf0d05ef04"} Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.593168 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34b1401c492b964ba74cefce6c1d15b1bf4a20ada96b00b9692f9daf0d05ef04" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.603677 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-l7969" event={"ID":"bed91f17-96d1-4f7d-8f38-0279ec5c8c3c","Type":"ContainerDied","Data":"9b2dd5086684e37734702c59358138ef587eda7527730b626846f5de93e756d4"} Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.603735 4991 scope.go:117] "RemoveContainer" containerID="04eb3b04a75105f518df73d912d22158916ebc3215fd8b6a9435bf0af863cf5c" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.603855 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-l7969" Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.613770 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d","Type":"ContainerStarted","Data":"49860fe7a195d7931334a084126ad5a9072b29a097d6bd7d23b29dbd92bed983"} Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.625240 4991 generic.go:334] "Generic (PLEG): container finished" podID="bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" containerID="524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8" exitCode=0 Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.626538 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" event={"ID":"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7","Type":"ContainerDied","Data":"524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8"} Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.815307 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-l7969"] Oct 06 08:38:57 crc kubenswrapper[4991]: I1006 08:38:57.829979 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-l7969"] Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.106049 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.143979 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.196947 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.651615 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d","Type":"ContainerStarted","Data":"0a355feb446eebc623fcb55c6ebd765cb1db8e250a80ee5601812066b9b4b008"} Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.654281 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" event={"ID":"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7","Type":"ContainerStarted","Data":"95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a"} Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.654358 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.654433 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-52tpz"] Oct 06 08:38:58 crc kubenswrapper[4991]: E1006 08:38:58.654974 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" containerName="init" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.654991 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" containerName="init" Oct 06 08:38:58 crc kubenswrapper[4991]: E1006 08:38:58.655019 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3d0945-d5b0-43bc-9ecf-0c0023ba2566" containerName="mariadb-account-create" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.655028 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3d0945-d5b0-43bc-9ecf-0c0023ba2566" containerName="mariadb-account-create" Oct 06 08:38:58 crc kubenswrapper[4991]: E1006 08:38:58.655060 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e421039-325e-471a-9b26-3ef3ed347c77" containerName="dnsmasq-dns" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.655067 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e421039-325e-471a-9b26-3ef3ed347c77" containerName="dnsmasq-dns" Oct 06 08:38:58 crc 
kubenswrapper[4991]: E1006 08:38:58.655084 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b6a26f-f25f-401d-a645-e94f9815314c" containerName="mariadb-account-create" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.655092 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b6a26f-f25f-401d-a645-e94f9815314c" containerName="mariadb-account-create" Oct 06 08:38:58 crc kubenswrapper[4991]: E1006 08:38:58.655101 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e421039-325e-471a-9b26-3ef3ed347c77" containerName="init" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.655108 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e421039-325e-471a-9b26-3ef3ed347c77" containerName="init" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.655273 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" containerName="init" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.655333 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e421039-325e-471a-9b26-3ef3ed347c77" containerName="dnsmasq-dns" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.655341 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3d0945-d5b0-43bc-9ecf-0c0023ba2566" containerName="mariadb-account-create" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.655350 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b6a26f-f25f-401d-a645-e94f9815314c" containerName="mariadb-account-create" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.655997 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.658594 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tpjld" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.659356 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.659975 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b24f06db-f8f9-48df-8ea7-69dea0d33c26","Type":"ContainerStarted","Data":"e51f10094de2ce556cb3ea62528c79089f7659f0c40ed3dd871afea44b36b894"} Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.661332 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.663761 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-52tpz"] Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.682979 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" podStartSLOduration=3.682952052 podStartE2EDuration="3.682952052s" podCreationTimestamp="2025-10-06 08:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:58.674509684 +0000 UTC m=+1190.412259705" watchObservedRunningTime="2025-10-06 08:38:58.682952052 +0000 UTC m=+1190.420702073" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.779034 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-etc-machine-id\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 
08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.779108 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-db-sync-config-data\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.779223 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-combined-ca-bundle\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.779262 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9b2r\" (UniqueName: \"kubernetes.io/projected/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-kube-api-access-r9b2r\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.779359 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-scripts\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.779464 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-config-data\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 
08:38:58.881244 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9b2r\" (UniqueName: \"kubernetes.io/projected/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-kube-api-access-r9b2r\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.881391 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-scripts\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.881457 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-config-data\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.881520 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-etc-machine-id\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.881545 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-db-sync-config-data\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.881619 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-combined-ca-bundle\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.882463 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-etc-machine-id\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.887174 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-scripts\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.888563 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-db-sync-config-data\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.893141 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-config-data\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.902841 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9b2r\" (UniqueName: \"kubernetes.io/projected/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-kube-api-access-r9b2r\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " 
pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.904012 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-combined-ca-bundle\") pod \"cinder-db-sync-52tpz\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:58 crc kubenswrapper[4991]: I1006 08:38:58.996170 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-52tpz" Oct 06 08:38:59 crc kubenswrapper[4991]: I1006 08:38:59.258514 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed91f17-96d1-4f7d-8f38-0279ec5c8c3c" path="/var/lib/kubelet/pods/bed91f17-96d1-4f7d-8f38-0279ec5c8c3c/volumes" Oct 06 08:38:59 crc kubenswrapper[4991]: I1006 08:38:59.671316 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d","Type":"ContainerStarted","Data":"29e774cc7f543cdc313c0341989531c1a969bd5840959f94b73c3f8b4787ac18"} Oct 06 08:38:59 crc kubenswrapper[4991]: I1006 08:38:59.671479 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" containerName="glance-log" containerID="cri-o://0a355feb446eebc623fcb55c6ebd765cb1db8e250a80ee5601812066b9b4b008" gracePeriod=30 Oct 06 08:38:59 crc kubenswrapper[4991]: I1006 08:38:59.672106 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" containerName="glance-httpd" containerID="cri-o://29e774cc7f543cdc313c0341989531c1a969bd5840959f94b73c3f8b4787ac18" gracePeriod=30 Oct 06 08:38:59 crc kubenswrapper[4991]: I1006 08:38:59.675093 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"b24f06db-f8f9-48df-8ea7-69dea0d33c26","Type":"ContainerStarted","Data":"5435ca2835848c5b834c5e98c3d00ce0e7ce54fd9d0e392993ac5deea2d4f7d5"} Oct 06 08:38:59 crc kubenswrapper[4991]: I1006 08:38:59.675253 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" containerName="glance-log" containerID="cri-o://e51f10094de2ce556cb3ea62528c79089f7659f0c40ed3dd871afea44b36b894" gracePeriod=30 Oct 06 08:38:59 crc kubenswrapper[4991]: I1006 08:38:59.675284 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" containerName="glance-httpd" containerID="cri-o://5435ca2835848c5b834c5e98c3d00ce0e7ce54fd9d0e392993ac5deea2d4f7d5" gracePeriod=30 Oct 06 08:38:59 crc kubenswrapper[4991]: I1006 08:38:59.695423 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.695404129 podStartE2EDuration="5.695404129s" podCreationTimestamp="2025-10-06 08:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:59.693438513 +0000 UTC m=+1191.431188544" watchObservedRunningTime="2025-10-06 08:38:59.695404129 +0000 UTC m=+1191.433154150" Oct 06 08:38:59 crc kubenswrapper[4991]: I1006 08:38:59.725890 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.725870538 podStartE2EDuration="5.725870538s" podCreationTimestamp="2025-10-06 08:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:38:59.723563583 +0000 UTC m=+1191.461313624" 
watchObservedRunningTime="2025-10-06 08:38:59.725870538 +0000 UTC m=+1191.463620559" Oct 06 08:39:00 crc kubenswrapper[4991]: I1006 08:39:00.689157 4991 generic.go:334] "Generic (PLEG): container finished" podID="e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" containerID="93bde15692d80421b438da8a25bddff9e4c4228214885a56f99ff4f63cea895f" exitCode=0 Oct 06 08:39:00 crc kubenswrapper[4991]: I1006 08:39:00.689448 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wjsnl" event={"ID":"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5","Type":"ContainerDied","Data":"93bde15692d80421b438da8a25bddff9e4c4228214885a56f99ff4f63cea895f"} Oct 06 08:39:00 crc kubenswrapper[4991]: I1006 08:39:00.693564 4991 generic.go:334] "Generic (PLEG): container finished" podID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" containerID="29e774cc7f543cdc313c0341989531c1a969bd5840959f94b73c3f8b4787ac18" exitCode=0 Oct 06 08:39:00 crc kubenswrapper[4991]: I1006 08:39:00.693591 4991 generic.go:334] "Generic (PLEG): container finished" podID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" containerID="0a355feb446eebc623fcb55c6ebd765cb1db8e250a80ee5601812066b9b4b008" exitCode=143 Oct 06 08:39:00 crc kubenswrapper[4991]: I1006 08:39:00.693642 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d","Type":"ContainerDied","Data":"29e774cc7f543cdc313c0341989531c1a969bd5840959f94b73c3f8b4787ac18"} Oct 06 08:39:00 crc kubenswrapper[4991]: I1006 08:39:00.693676 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d","Type":"ContainerDied","Data":"0a355feb446eebc623fcb55c6ebd765cb1db8e250a80ee5601812066b9b4b008"} Oct 06 08:39:00 crc kubenswrapper[4991]: I1006 08:39:00.695886 4991 generic.go:334] "Generic (PLEG): container finished" podID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" 
containerID="5435ca2835848c5b834c5e98c3d00ce0e7ce54fd9d0e392993ac5deea2d4f7d5" exitCode=0 Oct 06 08:39:00 crc kubenswrapper[4991]: I1006 08:39:00.695900 4991 generic.go:334] "Generic (PLEG): container finished" podID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" containerID="e51f10094de2ce556cb3ea62528c79089f7659f0c40ed3dd871afea44b36b894" exitCode=143 Oct 06 08:39:00 crc kubenswrapper[4991]: I1006 08:39:00.695915 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b24f06db-f8f9-48df-8ea7-69dea0d33c26","Type":"ContainerDied","Data":"5435ca2835848c5b834c5e98c3d00ce0e7ce54fd9d0e392993ac5deea2d4f7d5"} Oct 06 08:39:00 crc kubenswrapper[4991]: I1006 08:39:00.695929 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b24f06db-f8f9-48df-8ea7-69dea0d33c26","Type":"ContainerDied","Data":"e51f10094de2ce556cb3ea62528c79089f7659f0c40ed3dd871afea44b36b894"} Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.413003 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-5sltb"] Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.414955 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.416761 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-snz5j" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.417227 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.422555 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5sltb"] Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.452554 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b842-account-create-cf7lf"] Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.461459 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b842-account-create-cf7lf" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.463666 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.465890 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b842-account-create-cf7lf"] Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.466638 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-combined-ca-bundle\") pod \"barbican-db-sync-5sltb\" (UID: \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.466690 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-db-sync-config-data\") pod \"barbican-db-sync-5sltb\" (UID: 
\"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.466788 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29pdv\" (UniqueName: \"kubernetes.io/projected/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-kube-api-access-29pdv\") pod \"barbican-db-sync-5sltb\" (UID: \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.568488 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-db-sync-config-data\") pod \"barbican-db-sync-5sltb\" (UID: \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.568644 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29pdv\" (UniqueName: \"kubernetes.io/projected/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-kube-api-access-29pdv\") pod \"barbican-db-sync-5sltb\" (UID: \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.568676 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95s9\" (UniqueName: \"kubernetes.io/projected/adc11447-fe09-4d10-9d49-c064f5fffc7d-kube-api-access-w95s9\") pod \"neutron-b842-account-create-cf7lf\" (UID: \"adc11447-fe09-4d10-9d49-c064f5fffc7d\") " pod="openstack/neutron-b842-account-create-cf7lf" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.568734 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-combined-ca-bundle\") pod \"barbican-db-sync-5sltb\" (UID: 
\"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.575078 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-db-sync-config-data\") pod \"barbican-db-sync-5sltb\" (UID: \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.575116 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-combined-ca-bundle\") pod \"barbican-db-sync-5sltb\" (UID: \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.584627 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29pdv\" (UniqueName: \"kubernetes.io/projected/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-kube-api-access-29pdv\") pod \"barbican-db-sync-5sltb\" (UID: \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.669981 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95s9\" (UniqueName: \"kubernetes.io/projected/adc11447-fe09-4d10-9d49-c064f5fffc7d-kube-api-access-w95s9\") pod \"neutron-b842-account-create-cf7lf\" (UID: \"adc11447-fe09-4d10-9d49-c064f5fffc7d\") " pod="openstack/neutron-b842-account-create-cf7lf" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.689256 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95s9\" (UniqueName: \"kubernetes.io/projected/adc11447-fe09-4d10-9d49-c064f5fffc7d-kube-api-access-w95s9\") pod \"neutron-b842-account-create-cf7lf\" (UID: \"adc11447-fe09-4d10-9d49-c064f5fffc7d\") " 
pod="openstack/neutron-b842-account-create-cf7lf" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.745912 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.789936 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b842-account-create-cf7lf" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.827405 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.850840 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.868647 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wjsnl" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.872796 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.872855 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-scripts\") pod \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.872917 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-config-data\") pod \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " Oct 06 08:39:03 
crc kubenswrapper[4991]: I1006 08:39:03.872963 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-combined-ca-bundle\") pod \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.872997 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-internal-tls-certs\") pod \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.873051 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-httpd-run\") pod \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.873137 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zh9r\" (UniqueName: \"kubernetes.io/projected/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-kube-api-access-5zh9r\") pod \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.873340 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-logs\") pod \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\" (UID: \"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.874749 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-logs" (OuterVolumeSpecName: "logs") pod 
"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" (UID: "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.877053 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" (UID: "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.880469 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" (UID: "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.881537 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-scripts" (OuterVolumeSpecName: "scripts") pod "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" (UID: "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.882682 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-kube-api-access-5zh9r" (OuterVolumeSpecName: "kube-api-access-5zh9r") pod "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" (UID: "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d"). InnerVolumeSpecName "kube-api-access-5zh9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.905694 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" (UID: "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.936356 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-config-data" (OuterVolumeSpecName: "config-data") pod "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" (UID: "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.945313 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" (UID: "f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.975409 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-config-data\") pod \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.975640 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-public-tls-certs\") pod \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.975698 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-config-data\") pod \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.975739 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-combined-ca-bundle\") pod \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.975777 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-scripts\") pod \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.975836 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-scripts\") pod \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.975873 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbk99\" (UniqueName: \"kubernetes.io/projected/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-kube-api-access-sbk99\") pod \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.975918 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5vhr\" (UniqueName: \"kubernetes.io/projected/b24f06db-f8f9-48df-8ea7-69dea0d33c26-kube-api-access-l5vhr\") pod \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.976433 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-combined-ca-bundle\") pod \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.976466 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-logs\") pod \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.976492 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.976523 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-fernet-keys\") pod \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.976571 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-credential-keys\") pod \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\" (UID: \"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.976598 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-httpd-run\") pod \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\" (UID: \"b24f06db-f8f9-48df-8ea7-69dea0d33c26\") " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.977067 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.977083 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.977095 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.977105 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.977116 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.977126 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.977136 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zh9r\" (UniqueName: \"kubernetes.io/projected/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-kube-api-access-5zh9r\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.977146 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.980382 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-logs" (OuterVolumeSpecName: "logs") pod "b24f06db-f8f9-48df-8ea7-69dea0d33c26" (UID: "b24f06db-f8f9-48df-8ea7-69dea0d33c26"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.984764 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b24f06db-f8f9-48df-8ea7-69dea0d33c26" (UID: "b24f06db-f8f9-48df-8ea7-69dea0d33c26"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.994428 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24f06db-f8f9-48df-8ea7-69dea0d33c26-kube-api-access-l5vhr" (OuterVolumeSpecName: "kube-api-access-l5vhr") pod "b24f06db-f8f9-48df-8ea7-69dea0d33c26" (UID: "b24f06db-f8f9-48df-8ea7-69dea0d33c26"). InnerVolumeSpecName "kube-api-access-l5vhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:03 crc kubenswrapper[4991]: I1006 08:39:03.995514 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "b24f06db-f8f9-48df-8ea7-69dea0d33c26" (UID: "b24f06db-f8f9-48df-8ea7-69dea0d33c26"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.000119 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-scripts" (OuterVolumeSpecName: "scripts") pod "b24f06db-f8f9-48df-8ea7-69dea0d33c26" (UID: "b24f06db-f8f9-48df-8ea7-69dea0d33c26"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.001217 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" (UID: "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.008410 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-scripts" (OuterVolumeSpecName: "scripts") pod "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" (UID: "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.010181 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" (UID: "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.014904 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-kube-api-access-sbk99" (OuterVolumeSpecName: "kube-api-access-sbk99") pod "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" (UID: "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5"). InnerVolumeSpecName "kube-api-access-sbk99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.077313 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-config-data" (OuterVolumeSpecName: "config-data") pod "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" (UID: "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.078405 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.078428 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.078439 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbk99\" (UniqueName: \"kubernetes.io/projected/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-kube-api-access-sbk99\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.078463 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5vhr\" (UniqueName: \"kubernetes.io/projected/b24f06db-f8f9-48df-8ea7-69dea0d33c26-kube-api-access-l5vhr\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.078475 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.078508 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.078518 4991 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.078528 4991 reconciler_common.go:293] "Volume detached for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.078536 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b24f06db-f8f9-48df-8ea7-69dea0d33c26-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.078545 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.080507 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.100833 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.111521 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-config-data" (OuterVolumeSpecName: "config-data") pod "b24f06db-f8f9-48df-8ea7-69dea0d33c26" (UID: "b24f06db-f8f9-48df-8ea7-69dea0d33c26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.124700 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" (UID: "e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.134439 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b24f06db-f8f9-48df-8ea7-69dea0d33c26" (UID: "b24f06db-f8f9-48df-8ea7-69dea0d33c26"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.138932 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b24f06db-f8f9-48df-8ea7-69dea0d33c26" (UID: "b24f06db-f8f9-48df-8ea7-69dea0d33c26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.180121 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.180152 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.180161 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.180169 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 
08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.180178 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.180185 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b24f06db-f8f9-48df-8ea7-69dea0d33c26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:04 crc kubenswrapper[4991]: W1006 08:39:04.251683 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ae62f13_d5be_414e_a6f9_9b2e475afbd1.slice/crio-d6f3fc17ada598288a6a54ff08d21ccacf2eb549155e9cde49458a5d06e1c105 WatchSource:0}: Error finding container d6f3fc17ada598288a6a54ff08d21ccacf2eb549155e9cde49458a5d06e1c105: Status 404 returned error can't find the container with id d6f3fc17ada598288a6a54ff08d21ccacf2eb549155e9cde49458a5d06e1c105 Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.252879 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-52tpz"] Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.383414 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b842-account-create-cf7lf"] Oct 06 08:39:04 crc kubenswrapper[4991]: W1006 08:39:04.389482 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85987e42_3d5a_45e3_af5a_f1dd6b1bcfc5.slice/crio-835a71678b6487101d76f4781d5e48fe1b8c822b38c065fc4637078b65db9ccf WatchSource:0}: Error finding container 835a71678b6487101d76f4781d5e48fe1b8c822b38c065fc4637078b65db9ccf: Status 404 returned error can't find the container with id 835a71678b6487101d76f4781d5e48fe1b8c822b38c065fc4637078b65db9ccf Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.409816 4991 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5sltb"] Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.740608 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wjsnl" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.740669 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wjsnl" event={"ID":"e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5","Type":"ContainerDied","Data":"9bf99f731fc16ae28fd3ead547de6934bd08ea07c7fc2f57c90babe0d4d64833"} Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.740743 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bf99f731fc16ae28fd3ead547de6934bd08ea07c7fc2f57c90babe0d4d64833" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.742663 4991 generic.go:334] "Generic (PLEG): container finished" podID="adc11447-fe09-4d10-9d49-c064f5fffc7d" containerID="e7be98d10e1a8ee5623408c860107ad0dafba366dc2800ccdd3c919ef2c6078a" exitCode=0 Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.742721 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b842-account-create-cf7lf" event={"ID":"adc11447-fe09-4d10-9d49-c064f5fffc7d","Type":"ContainerDied","Data":"e7be98d10e1a8ee5623408c860107ad0dafba366dc2800ccdd3c919ef2c6078a"} Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.742749 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b842-account-create-cf7lf" event={"ID":"adc11447-fe09-4d10-9d49-c064f5fffc7d","Type":"ContainerStarted","Data":"98cc42ed0951bcff613b76371d003f38dfc3bd4ae6f312e4128a9ac3a7cb8194"} Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.745657 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7","Type":"ContainerStarted","Data":"e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651"} Oct 
06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.749221 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d","Type":"ContainerDied","Data":"49860fe7a195d7931334a084126ad5a9072b29a097d6bd7d23b29dbd92bed983"} Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.749284 4991 scope.go:117] "RemoveContainer" containerID="29e774cc7f543cdc313c0341989531c1a969bd5840959f94b73c3f8b4787ac18" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.749572 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.751831 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4mpdq" event={"ID":"d139f7e8-c126-43bf-9a26-7692b455412b","Type":"ContainerStarted","Data":"855e698b4a89b7f90fca1d65066f42cf770d32b0ec2573bc09f0d5dcbde6d2e3"} Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.758928 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.760046 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b24f06db-f8f9-48df-8ea7-69dea0d33c26","Type":"ContainerDied","Data":"ebc8106f6a691b71a9869a764758d338b221fd93f2fc6e43f61b2e488a56db4a"} Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.772842 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5sltb" event={"ID":"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5","Type":"ContainerStarted","Data":"835a71678b6487101d76f4781d5e48fe1b8c822b38c065fc4637078b65db9ccf"} Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.777339 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-52tpz" event={"ID":"5ae62f13-d5be-414e-a6f9-9b2e475afbd1","Type":"ContainerStarted","Data":"d6f3fc17ada598288a6a54ff08d21ccacf2eb549155e9cde49458a5d06e1c105"} Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.801439 4991 scope.go:117] "RemoveContainer" containerID="0a355feb446eebc623fcb55c6ebd765cb1db8e250a80ee5601812066b9b4b008" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.819209 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4mpdq" podStartSLOduration=2.301401456 podStartE2EDuration="9.819186872s" podCreationTimestamp="2025-10-06 08:38:55 +0000 UTC" firstStartedPulling="2025-10-06 08:38:56.263033898 +0000 UTC m=+1188.000783919" lastFinishedPulling="2025-10-06 08:39:03.780819314 +0000 UTC m=+1195.518569335" observedRunningTime="2025-10-06 08:39:04.799492497 +0000 UTC m=+1196.537242538" watchObservedRunningTime="2025-10-06 08:39:04.819186872 +0000 UTC m=+1196.556936913" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.824343 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:39:04 crc 
kubenswrapper[4991]: I1006 08:39:04.841687 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853067 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:39:04 crc kubenswrapper[4991]: E1006 08:39:04.853548 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" containerName="glance-log" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853574 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" containerName="glance-log" Oct 06 08:39:04 crc kubenswrapper[4991]: E1006 08:39:04.853593 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" containerName="glance-httpd" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853604 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" containerName="glance-httpd" Oct 06 08:39:04 crc kubenswrapper[4991]: E1006 08:39:04.853627 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" containerName="keystone-bootstrap" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853637 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" containerName="keystone-bootstrap" Oct 06 08:39:04 crc kubenswrapper[4991]: E1006 08:39:04.853651 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" containerName="glance-log" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853658 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" containerName="glance-log" Oct 06 08:39:04 crc kubenswrapper[4991]: E1006 08:39:04.853686 4991 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" containerName="glance-httpd" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853695 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" containerName="glance-httpd" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853914 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" containerName="glance-httpd" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853929 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" containerName="glance-log" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853949 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" containerName="glance-log" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853966 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" containerName="glance-httpd" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.853979 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" containerName="keystone-bootstrap" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.855096 4991 scope.go:117] "RemoveContainer" containerID="5435ca2835848c5b834c5e98c3d00ce0e7ce54fd9d0e392993ac5deea2d4f7d5" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.855105 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.860827 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.860885 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.861110 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lt5hb" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.860838 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.874971 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.885853 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.897890 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-logs\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.897948 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.897985 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-config-data\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.898013 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.898040 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.898062 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcflh\" (UniqueName: \"kubernetes.io/projected/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-kube-api-access-bcflh\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.898093 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.898144 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-scripts\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.900527 4991 scope.go:117] "RemoveContainer" containerID="e51f10094de2ce556cb3ea62528c79089f7659f0c40ed3dd871afea44b36b894" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.902589 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.919599 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.921232 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.923447 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.923648 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 08:39:04 crc kubenswrapper[4991]: I1006 08:39:04.948542 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000015 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-scripts\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000070 4991 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000123 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-logs\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000191 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000544 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000659 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-logs\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000743 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000780 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000850 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-config-data\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000885 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000925 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.000957 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bcflh\" (UniqueName: \"kubernetes.io/projected/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-kube-api-access-bcflh\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.001011 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.001038 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km9zg\" (UniqueName: \"kubernetes.io/projected/c526d349-85d0-4d1a-9994-4394742e3051-kube-api-access-km9zg\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.001609 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-logs\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.001724 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.001763 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.001806 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.006253 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-scripts\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.009554 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.010409 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.010801 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-config-data\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.012602 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.016223 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcflh\" (UniqueName: \"kubernetes.io/projected/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-kube-api-access-bcflh\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.042990 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.067274 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wjsnl"] Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.072772 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wjsnl"] Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.105116 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.105169 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-logs\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.105201 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.105224 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.105268 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.105406 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km9zg\" (UniqueName: \"kubernetes.io/projected/c526d349-85d0-4d1a-9994-4394742e3051-kube-api-access-km9zg\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " 
pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.105436 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.105457 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.106195 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-logs\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.106448 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.106692 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.115222 
4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.115369 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.115977 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.117663 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.129710 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km9zg\" (UniqueName: \"kubernetes.io/projected/c526d349-85d0-4d1a-9994-4394742e3051-kube-api-access-km9zg\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.174048 4991 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-bootstrap-7pz2m"] Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.175663 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.175738 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.180738 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.186543 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.187093 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-45scd" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.187317 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.187516 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.204728 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7pz2m"] Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.207445 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-fernet-keys\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc 
kubenswrapper[4991]: I1006 08:39:05.207569 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g6k4\" (UniqueName: \"kubernetes.io/projected/c0dc26d4-eaf8-4419-a5e9-82e40496890b-kube-api-access-6g6k4\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.207616 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-scripts\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.207671 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-config-data\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.207734 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-credential-keys\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.207767 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-combined-ca-bundle\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc 
kubenswrapper[4991]: I1006 08:39:05.256273 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.273950 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24f06db-f8f9-48df-8ea7-69dea0d33c26" path="/var/lib/kubelet/pods/b24f06db-f8f9-48df-8ea7-69dea0d33c26/volumes" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.276445 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5" path="/var/lib/kubelet/pods/e9cb88d7-732f-4ff1-b9ab-b9a081d55fd5/volumes" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.277814 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d" path="/var/lib/kubelet/pods/f8e63ee5-2b7c-488b-b25c-4ffd11e0e29d/volumes" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.313232 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-credential-keys\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.313544 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-combined-ca-bundle\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.313621 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-fernet-keys\") pod \"keystone-bootstrap-7pz2m\" (UID: 
\"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.313645 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g6k4\" (UniqueName: \"kubernetes.io/projected/c0dc26d4-eaf8-4419-a5e9-82e40496890b-kube-api-access-6g6k4\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.313690 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-scripts\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.313787 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-config-data\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.320008 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-credential-keys\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.320943 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-combined-ca-bundle\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc 
kubenswrapper[4991]: I1006 08:39:05.327165 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-config-data\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.328936 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-fernet-keys\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.329631 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-scripts\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.360817 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g6k4\" (UniqueName: \"kubernetes.io/projected/c0dc26d4-eaf8-4419-a5e9-82e40496890b-kube-api-access-6g6k4\") pod \"keystone-bootstrap-7pz2m\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.506396 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.517236 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.585551 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-8h5bh"] Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.585805 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" podUID="ddd94528-deb5-46b2-b5c5-5aba9b33b05d" containerName="dnsmasq-dns" containerID="cri-o://ace95d799694a982e2861042f2798315a25cc282e7a43165e328ad10f02f4aa7" gracePeriod=10 Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.824159 4991 generic.go:334] "Generic (PLEG): container finished" podID="ddd94528-deb5-46b2-b5c5-5aba9b33b05d" containerID="ace95d799694a982e2861042f2798315a25cc282e7a43165e328ad10f02f4aa7" exitCode=0 Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.824245 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" event={"ID":"ddd94528-deb5-46b2-b5c5-5aba9b33b05d","Type":"ContainerDied","Data":"ace95d799694a982e2861042f2798315a25cc282e7a43165e328ad10f02f4aa7"} Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.913454 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:39:05 crc kubenswrapper[4991]: I1006 08:39:05.980065 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:39:06 crc kubenswrapper[4991]: W1006 08:39:06.000587 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc526d349_85d0_4d1a_9994_4394742e3051.slice/crio-9f7056fb53c728de0f27984386ce146258e84dd9e824c3816b338e72f6d109f2 WatchSource:0}: Error finding container 9f7056fb53c728de0f27984386ce146258e84dd9e824c3816b338e72f6d109f2: Status 404 returned error can't find the container with id 
9f7056fb53c728de0f27984386ce146258e84dd9e824c3816b338e72f6d109f2 Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.300713 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.339876 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tgqt\" (UniqueName: \"kubernetes.io/projected/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-kube-api-access-4tgqt\") pod \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.340241 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-swift-storage-0\") pod \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.340398 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-nb\") pod \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.340546 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-sb\") pod \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.340661 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-config\") pod \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\" (UID: 
\"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.340772 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-svc\") pod \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\" (UID: \"ddd94528-deb5-46b2-b5c5-5aba9b33b05d\") " Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.346542 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-kube-api-access-4tgqt" (OuterVolumeSpecName: "kube-api-access-4tgqt") pod "ddd94528-deb5-46b2-b5c5-5aba9b33b05d" (UID: "ddd94528-deb5-46b2-b5c5-5aba9b33b05d"). InnerVolumeSpecName "kube-api-access-4tgqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.429711 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ddd94528-deb5-46b2-b5c5-5aba9b33b05d" (UID: "ddd94528-deb5-46b2-b5c5-5aba9b33b05d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.443395 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tgqt\" (UniqueName: \"kubernetes.io/projected/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-kube-api-access-4tgqt\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.443424 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.447053 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ddd94528-deb5-46b2-b5c5-5aba9b33b05d" (UID: "ddd94528-deb5-46b2-b5c5-5aba9b33b05d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.450503 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ddd94528-deb5-46b2-b5c5-5aba9b33b05d" (UID: "ddd94528-deb5-46b2-b5c5-5aba9b33b05d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.493033 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ddd94528-deb5-46b2-b5c5-5aba9b33b05d" (UID: "ddd94528-deb5-46b2-b5c5-5aba9b33b05d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.517894 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-config" (OuterVolumeSpecName: "config") pod "ddd94528-deb5-46b2-b5c5-5aba9b33b05d" (UID: "ddd94528-deb5-46b2-b5c5-5aba9b33b05d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.519327 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7pz2m"] Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.545218 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.545248 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.545259 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.545269 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddd94528-deb5-46b2-b5c5-5aba9b33b05d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.856598 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f64668d5-09ed-4843-a117-ed6a3ae0d2ee","Type":"ContainerStarted","Data":"a2a6e712ac43e35d206bfa4081069a2b3747825540f3c6c13a96fca1c73ff32e"} Oct 06 
08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.859406 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c526d349-85d0-4d1a-9994-4394742e3051","Type":"ContainerStarted","Data":"9f7056fb53c728de0f27984386ce146258e84dd9e824c3816b338e72f6d109f2"} Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.863094 4991 generic.go:334] "Generic (PLEG): container finished" podID="d139f7e8-c126-43bf-9a26-7692b455412b" containerID="855e698b4a89b7f90fca1d65066f42cf770d32b0ec2573bc09f0d5dcbde6d2e3" exitCode=0 Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.863173 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4mpdq" event={"ID":"d139f7e8-c126-43bf-9a26-7692b455412b","Type":"ContainerDied","Data":"855e698b4a89b7f90fca1d65066f42cf770d32b0ec2573bc09f0d5dcbde6d2e3"} Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.866378 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" event={"ID":"ddd94528-deb5-46b2-b5c5-5aba9b33b05d","Type":"ContainerDied","Data":"deb88501fdc75460d6f35d73bdded82846b9f946a33769d4569d0c000439c9d6"} Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.866409 4991 scope.go:117] "RemoveContainer" containerID="ace95d799694a982e2861042f2798315a25cc282e7a43165e328ad10f02f4aa7" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.866514 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-8h5bh" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.918463 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-8h5bh"] Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.925414 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-8h5bh"] Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.970709 4991 scope.go:117] "RemoveContainer" containerID="8acc746533318f731fb98e18db082b73437f09cba726085cb1598e3c4dc70e47" Oct 06 08:39:06 crc kubenswrapper[4991]: I1006 08:39:06.977398 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b842-account-create-cf7lf" Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.056053 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w95s9\" (UniqueName: \"kubernetes.io/projected/adc11447-fe09-4d10-9d49-c064f5fffc7d-kube-api-access-w95s9\") pod \"adc11447-fe09-4d10-9d49-c064f5fffc7d\" (UID: \"adc11447-fe09-4d10-9d49-c064f5fffc7d\") " Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.059458 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adc11447-fe09-4d10-9d49-c064f5fffc7d-kube-api-access-w95s9" (OuterVolumeSpecName: "kube-api-access-w95s9") pod "adc11447-fe09-4d10-9d49-c064f5fffc7d" (UID: "adc11447-fe09-4d10-9d49-c064f5fffc7d"). InnerVolumeSpecName "kube-api-access-w95s9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.157255 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w95s9\" (UniqueName: \"kubernetes.io/projected/adc11447-fe09-4d10-9d49-c064f5fffc7d-kube-api-access-w95s9\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.282790 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd94528-deb5-46b2-b5c5-5aba9b33b05d" path="/var/lib/kubelet/pods/ddd94528-deb5-46b2-b5c5-5aba9b33b05d/volumes" Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.876737 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c526d349-85d0-4d1a-9994-4394742e3051","Type":"ContainerStarted","Data":"cd1debd69e908cac90a16b7137761847b47db6686c56a11e1822475462f4e431"} Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.877109 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c526d349-85d0-4d1a-9994-4394742e3051","Type":"ContainerStarted","Data":"63fb2fb78ee5a293703b11c47bf6171b4cd47f6081d25abdd8ec70aa73cb45ba"} Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.880535 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b842-account-create-cf7lf" event={"ID":"adc11447-fe09-4d10-9d49-c064f5fffc7d","Type":"ContainerDied","Data":"98cc42ed0951bcff613b76371d003f38dfc3bd4ae6f312e4128a9ac3a7cb8194"} Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.880581 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98cc42ed0951bcff613b76371d003f38dfc3bd4ae6f312e4128a9ac3a7cb8194" Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.880643 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b842-account-create-cf7lf" Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.893336 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7pz2m" event={"ID":"c0dc26d4-eaf8-4419-a5e9-82e40496890b","Type":"ContainerStarted","Data":"3664b0b86ced009ed293faa237b64fa88b76a10da99303a55d6b375dde2bab1c"} Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.893405 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7pz2m" event={"ID":"c0dc26d4-eaf8-4419-a5e9-82e40496890b","Type":"ContainerStarted","Data":"99f6f104a5428a79376e8e2d5e69f537cc7a547868dbc8aee5f68a79a522fd1e"} Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.897355 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7","Type":"ContainerStarted","Data":"b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7"} Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.904705 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f64668d5-09ed-4843-a117-ed6a3ae0d2ee","Type":"ContainerStarted","Data":"43699468a2106f54411eb7bcbafd2db8fadc0b2780109916319a4caf5809fc7e"} Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.904760 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f64668d5-09ed-4843-a117-ed6a3ae0d2ee","Type":"ContainerStarted","Data":"e6dbd55d04dccf656cb66469e6a3b3b8f6ada4a86993ee092d6bde6b3393ea11"} Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.909195 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.909168909 podStartE2EDuration="3.909168909s" podCreationTimestamp="2025-10-06 08:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:07.898949552 +0000 UTC m=+1199.636699573" watchObservedRunningTime="2025-10-06 08:39:07.909168909 +0000 UTC m=+1199.646918920" Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.929949 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7pz2m" podStartSLOduration=2.929932085 podStartE2EDuration="2.929932085s" podCreationTimestamp="2025-10-06 08:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:07.924689517 +0000 UTC m=+1199.662439538" watchObservedRunningTime="2025-10-06 08:39:07.929932085 +0000 UTC m=+1199.667682106" Oct 06 08:39:07 crc kubenswrapper[4991]: I1006 08:39:07.971018 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.971000043 podStartE2EDuration="3.971000043s" podCreationTimestamp="2025-10-06 08:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:07.966368843 +0000 UTC m=+1199.704118864" watchObservedRunningTime="2025-10-06 08:39:07.971000043 +0000 UTC m=+1199.708750054" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.792159 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-s5hfs"] Oct 06 08:39:08 crc kubenswrapper[4991]: E1006 08:39:08.792914 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd94528-deb5-46b2-b5c5-5aba9b33b05d" containerName="init" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.792938 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd94528-deb5-46b2-b5c5-5aba9b33b05d" containerName="init" Oct 06 08:39:08 crc kubenswrapper[4991]: E1006 08:39:08.792974 4991 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ddd94528-deb5-46b2-b5c5-5aba9b33b05d" containerName="dnsmasq-dns" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.792985 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd94528-deb5-46b2-b5c5-5aba9b33b05d" containerName="dnsmasq-dns" Oct 06 08:39:08 crc kubenswrapper[4991]: E1006 08:39:08.793004 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adc11447-fe09-4d10-9d49-c064f5fffc7d" containerName="mariadb-account-create" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.793013 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="adc11447-fe09-4d10-9d49-c064f5fffc7d" containerName="mariadb-account-create" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.793227 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd94528-deb5-46b2-b5c5-5aba9b33b05d" containerName="dnsmasq-dns" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.793256 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="adc11447-fe09-4d10-9d49-c064f5fffc7d" containerName="mariadb-account-create" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.794127 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.795263 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-config\") pod \"neutron-db-sync-s5hfs\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.795398 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-combined-ca-bundle\") pod \"neutron-db-sync-s5hfs\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.795464 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q56zc\" (UniqueName: \"kubernetes.io/projected/8ad2fab6-f115-4ab3-b631-242ef3474da2-kube-api-access-q56zc\") pod \"neutron-db-sync-s5hfs\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.797371 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.797947 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5cb9d" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.805318 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.814124 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s5hfs"] Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.896893 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-combined-ca-bundle\") pod \"neutron-db-sync-s5hfs\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.896974 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q56zc\" (UniqueName: \"kubernetes.io/projected/8ad2fab6-f115-4ab3-b631-242ef3474da2-kube-api-access-q56zc\") pod \"neutron-db-sync-s5hfs\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.897085 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-config\") pod \"neutron-db-sync-s5hfs\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.915592 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-config\") pod \"neutron-db-sync-s5hfs\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.925576 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-combined-ca-bundle\") pod \"neutron-db-sync-s5hfs\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:08 crc kubenswrapper[4991]: I1006 08:39:08.966762 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q56zc\" (UniqueName: 
\"kubernetes.io/projected/8ad2fab6-f115-4ab3-b631-242ef3474da2-kube-api-access-q56zc\") pod \"neutron-db-sync-s5hfs\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:09 crc kubenswrapper[4991]: I1006 08:39:09.113567 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.679773 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4mpdq" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.835937 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-scripts\") pod \"d139f7e8-c126-43bf-9a26-7692b455412b\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.836003 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-combined-ca-bundle\") pod \"d139f7e8-c126-43bf-9a26-7692b455412b\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.836070 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d139f7e8-c126-43bf-9a26-7692b455412b-logs\") pod \"d139f7e8-c126-43bf-9a26-7692b455412b\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.836189 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-config-data\") pod \"d139f7e8-c126-43bf-9a26-7692b455412b\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 
08:39:10.836240 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6xq8\" (UniqueName: \"kubernetes.io/projected/d139f7e8-c126-43bf-9a26-7692b455412b-kube-api-access-d6xq8\") pod \"d139f7e8-c126-43bf-9a26-7692b455412b\" (UID: \"d139f7e8-c126-43bf-9a26-7692b455412b\") " Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.836626 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d139f7e8-c126-43bf-9a26-7692b455412b-logs" (OuterVolumeSpecName: "logs") pod "d139f7e8-c126-43bf-9a26-7692b455412b" (UID: "d139f7e8-c126-43bf-9a26-7692b455412b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.841506 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d139f7e8-c126-43bf-9a26-7692b455412b-kube-api-access-d6xq8" (OuterVolumeSpecName: "kube-api-access-d6xq8") pod "d139f7e8-c126-43bf-9a26-7692b455412b" (UID: "d139f7e8-c126-43bf-9a26-7692b455412b"). InnerVolumeSpecName "kube-api-access-d6xq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.841582 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-scripts" (OuterVolumeSpecName: "scripts") pod "d139f7e8-c126-43bf-9a26-7692b455412b" (UID: "d139f7e8-c126-43bf-9a26-7692b455412b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.867994 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-config-data" (OuterVolumeSpecName: "config-data") pod "d139f7e8-c126-43bf-9a26-7692b455412b" (UID: "d139f7e8-c126-43bf-9a26-7692b455412b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.880803 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d139f7e8-c126-43bf-9a26-7692b455412b" (UID: "d139f7e8-c126-43bf-9a26-7692b455412b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.940580 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d139f7e8-c126-43bf-9a26-7692b455412b-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.940621 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.940637 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6xq8\" (UniqueName: \"kubernetes.io/projected/d139f7e8-c126-43bf-9a26-7692b455412b-kube-api-access-d6xq8\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.940653 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.940666 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d139f7e8-c126-43bf-9a26-7692b455412b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.947253 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4mpdq" 
event={"ID":"d139f7e8-c126-43bf-9a26-7692b455412b","Type":"ContainerDied","Data":"c9dd7fd1b1482bc75371e10e6f5e7ef3e3f85c0927abad773f6305372c67dda9"} Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.947351 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9dd7fd1b1482bc75371e10e6f5e7ef3e3f85c0927abad773f6305372c67dda9" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.947271 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4mpdq" Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.949199 4991 generic.go:334] "Generic (PLEG): container finished" podID="c0dc26d4-eaf8-4419-a5e9-82e40496890b" containerID="3664b0b86ced009ed293faa237b64fa88b76a10da99303a55d6b375dde2bab1c" exitCode=0 Oct 06 08:39:10 crc kubenswrapper[4991]: I1006 08:39:10.949274 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7pz2m" event={"ID":"c0dc26d4-eaf8-4419-a5e9-82e40496890b","Type":"ContainerDied","Data":"3664b0b86ced009ed293faa237b64fa88b76a10da99303a55d6b375dde2bab1c"} Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.627348 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-s5hfs"] Oct 06 08:39:11 crc kubenswrapper[4991]: W1006 08:39:11.638458 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ad2fab6_f115_4ab3_b631_242ef3474da2.slice/crio-004b2ffb9c7886018ea8114a4e2c64b27b31b2c55a116049495834bacf8e2ed9 WatchSource:0}: Error finding container 004b2ffb9c7886018ea8114a4e2c64b27b31b2c55a116049495834bacf8e2ed9: Status 404 returned error can't find the container with id 004b2ffb9c7886018ea8114a4e2c64b27b31b2c55a116049495834bacf8e2ed9 Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.851073 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b98fcbb5b-2m256"] Oct 06 08:39:11 crc 
kubenswrapper[4991]: E1006 08:39:11.852569 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d139f7e8-c126-43bf-9a26-7692b455412b" containerName="placement-db-sync" Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.852592 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d139f7e8-c126-43bf-9a26-7692b455412b" containerName="placement-db-sync" Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.853322 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d139f7e8-c126-43bf-9a26-7692b455412b" containerName="placement-db-sync" Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.855064 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.862002 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.862349 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.862420 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7z8nn" Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.862654 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.862349 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.884196 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b98fcbb5b-2m256"] Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.961249 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s5hfs" 
event={"ID":"8ad2fab6-f115-4ab3-b631-242ef3474da2","Type":"ContainerStarted","Data":"c8554f9b2917b9400926d1608cbf5f4f2c2d666a21fe6417cd6a5eadb0c003c4"} Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.961674 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s5hfs" event={"ID":"8ad2fab6-f115-4ab3-b631-242ef3474da2","Type":"ContainerStarted","Data":"004b2ffb9c7886018ea8114a4e2c64b27b31b2c55a116049495834bacf8e2ed9"} Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.966273 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5sltb" event={"ID":"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5","Type":"ContainerStarted","Data":"4b198d47b63faa11ab3269678b4d8f8709c7776bf98c9f43b32df455e70fc098"} Oct 06 08:39:11 crc kubenswrapper[4991]: I1006 08:39:11.986264 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-s5hfs" podStartSLOduration=3.986241889 podStartE2EDuration="3.986241889s" podCreationTimestamp="2025-10-06 08:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:11.976102923 +0000 UTC m=+1203.713852944" watchObservedRunningTime="2025-10-06 08:39:11.986241889 +0000 UTC m=+1203.723991910" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.005716 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-5sltb" podStartSLOduration=2.185325136 podStartE2EDuration="9.005695948s" podCreationTimestamp="2025-10-06 08:39:03 +0000 UTC" firstStartedPulling="2025-10-06 08:39:04.412138235 +0000 UTC m=+1196.149888256" lastFinishedPulling="2025-10-06 08:39:11.232509047 +0000 UTC m=+1202.970259068" observedRunningTime="2025-10-06 08:39:11.995363816 +0000 UTC m=+1203.733113837" watchObservedRunningTime="2025-10-06 08:39:12.005695948 +0000 UTC m=+1203.743445969" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 
08:39:12.011792 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-internal-tls-certs\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.011895 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-scripts\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.011924 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-public-tls-certs\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.011953 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-config-data\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.012007 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-combined-ca-bundle\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: 
I1006 08:39:12.012038 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feb6a9a7-403e-4dc9-903c-349391d84efb-logs\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.012090 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdf7w\" (UniqueName: \"kubernetes.io/projected/feb6a9a7-403e-4dc9-903c-349391d84efb-kube-api-access-xdf7w\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.114667 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-scripts\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.114724 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-public-tls-certs\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.114754 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-config-data\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.114806 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-combined-ca-bundle\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.114837 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feb6a9a7-403e-4dc9-903c-349391d84efb-logs\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.114926 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdf7w\" (UniqueName: \"kubernetes.io/projected/feb6a9a7-403e-4dc9-903c-349391d84efb-kube-api-access-xdf7w\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.115323 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-internal-tls-certs\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.124690 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feb6a9a7-403e-4dc9-903c-349391d84efb-logs\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.127004 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-config-data\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.127365 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-combined-ca-bundle\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.127804 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-internal-tls-certs\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.131620 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-public-tls-certs\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.131620 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-scripts\") pod \"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.139239 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdf7w\" (UniqueName: \"kubernetes.io/projected/feb6a9a7-403e-4dc9-903c-349391d84efb-kube-api-access-xdf7w\") pod 
\"placement-6b98fcbb5b-2m256\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.223062 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.292052 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.424979 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6k4\" (UniqueName: \"kubernetes.io/projected/c0dc26d4-eaf8-4419-a5e9-82e40496890b-kube-api-access-6g6k4\") pod \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.425575 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-combined-ca-bundle\") pod \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.425627 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-fernet-keys\") pod \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.425642 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-scripts\") pod \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.427495 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-credential-keys\") pod \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.427604 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-config-data\") pod \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\" (UID: \"c0dc26d4-eaf8-4419-a5e9-82e40496890b\") " Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.432103 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0dc26d4-eaf8-4419-a5e9-82e40496890b-kube-api-access-6g6k4" (OuterVolumeSpecName: "kube-api-access-6g6k4") pod "c0dc26d4-eaf8-4419-a5e9-82e40496890b" (UID: "c0dc26d4-eaf8-4419-a5e9-82e40496890b"). InnerVolumeSpecName "kube-api-access-6g6k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.434752 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c0dc26d4-eaf8-4419-a5e9-82e40496890b" (UID: "c0dc26d4-eaf8-4419-a5e9-82e40496890b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.436384 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-scripts" (OuterVolumeSpecName: "scripts") pod "c0dc26d4-eaf8-4419-a5e9-82e40496890b" (UID: "c0dc26d4-eaf8-4419-a5e9-82e40496890b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.438416 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c0dc26d4-eaf8-4419-a5e9-82e40496890b" (UID: "c0dc26d4-eaf8-4419-a5e9-82e40496890b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.457529 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0dc26d4-eaf8-4419-a5e9-82e40496890b" (UID: "c0dc26d4-eaf8-4419-a5e9-82e40496890b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.467388 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-config-data" (OuterVolumeSpecName: "config-data") pod "c0dc26d4-eaf8-4419-a5e9-82e40496890b" (UID: "c0dc26d4-eaf8-4419-a5e9-82e40496890b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.531116 4991 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.531158 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.531174 4991 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.531182 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.531192 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6k4\" (UniqueName: \"kubernetes.io/projected/c0dc26d4-eaf8-4419-a5e9-82e40496890b-kube-api-access-6g6k4\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.531203 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0dc26d4-eaf8-4419-a5e9-82e40496890b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.691117 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b98fcbb5b-2m256"] Oct 06 08:39:12 crc kubenswrapper[4991]: W1006 08:39:12.701716 4991 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb6a9a7_403e_4dc9_903c_349391d84efb.slice/crio-f68fcaa96a538101b5a2515f8d7dbd7ca1052eab89f2d5473e684b8fded6fc0d WatchSource:0}: Error finding container f68fcaa96a538101b5a2515f8d7dbd7ca1052eab89f2d5473e684b8fded6fc0d: Status 404 returned error can't find the container with id f68fcaa96a538101b5a2515f8d7dbd7ca1052eab89f2d5473e684b8fded6fc0d Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.976910 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7pz2m" event={"ID":"c0dc26d4-eaf8-4419-a5e9-82e40496890b","Type":"ContainerDied","Data":"99f6f104a5428a79376e8e2d5e69f537cc7a547868dbc8aee5f68a79a522fd1e"} Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.977236 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99f6f104a5428a79376e8e2d5e69f537cc7a547868dbc8aee5f68a79a522fd1e" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.976930 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7pz2m" Oct 06 08:39:12 crc kubenswrapper[4991]: I1006 08:39:12.978497 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b98fcbb5b-2m256" event={"ID":"feb6a9a7-403e-4dc9-903c-349391d84efb","Type":"ContainerStarted","Data":"f68fcaa96a538101b5a2515f8d7dbd7ca1052eab89f2d5473e684b8fded6fc0d"} Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.062811 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-774597bb4-6c42q"] Oct 06 08:39:13 crc kubenswrapper[4991]: E1006 08:39:13.067390 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0dc26d4-eaf8-4419-a5e9-82e40496890b" containerName="keystone-bootstrap" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.067445 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0dc26d4-eaf8-4419-a5e9-82e40496890b" containerName="keystone-bootstrap" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.067725 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0dc26d4-eaf8-4419-a5e9-82e40496890b" containerName="keystone-bootstrap" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.074044 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.074534 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-774597bb4-6c42q"] Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.077284 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.077775 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.077970 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.078114 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-45scd" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.078253 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.082100 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.246640 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-public-tls-certs\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.246723 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zj7\" (UniqueName: \"kubernetes.io/projected/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-kube-api-access-d4zj7\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " 
pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.246808 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-combined-ca-bundle\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.246840 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-fernet-keys\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.247037 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-internal-tls-certs\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.247116 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-config-data\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.247186 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-scripts\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " 
pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.247221 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-credential-keys\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.355760 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-config-data\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.355900 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-scripts\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.355933 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-credential-keys\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.356044 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-public-tls-certs\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 
08:39:13.356106 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4zj7\" (UniqueName: \"kubernetes.io/projected/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-kube-api-access-d4zj7\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.356291 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-combined-ca-bundle\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.356931 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-fernet-keys\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.356981 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-internal-tls-certs\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.362459 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-scripts\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.362535 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-credential-keys\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.363007 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-fernet-keys\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.363014 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-public-tls-certs\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.364781 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-combined-ca-bundle\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.368888 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-config-data\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.373666 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4zj7\" (UniqueName: \"kubernetes.io/projected/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-kube-api-access-d4zj7\") pod 
\"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.373718 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-internal-tls-certs\") pod \"keystone-774597bb4-6c42q\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:13 crc kubenswrapper[4991]: I1006 08:39:13.394666 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:15 crc kubenswrapper[4991]: I1006 08:39:14.999865 4991 generic.go:334] "Generic (PLEG): container finished" podID="85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5" containerID="4b198d47b63faa11ab3269678b4d8f8709c7776bf98c9f43b32df455e70fc098" exitCode=0 Oct 06 08:39:15 crc kubenswrapper[4991]: I1006 08:39:14.999977 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5sltb" event={"ID":"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5","Type":"ContainerDied","Data":"4b198d47b63faa11ab3269678b4d8f8709c7776bf98c9f43b32df455e70fc098"} Oct 06 08:39:15 crc kubenswrapper[4991]: I1006 08:39:15.188497 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 08:39:15 crc kubenswrapper[4991]: I1006 08:39:15.188894 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 08:39:15 crc kubenswrapper[4991]: I1006 08:39:15.222255 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 08:39:15 crc kubenswrapper[4991]: I1006 08:39:15.272032 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:15 crc 
kubenswrapper[4991]: I1006 08:39:15.272085 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:15 crc kubenswrapper[4991]: I1006 08:39:15.272156 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 08:39:15 crc kubenswrapper[4991]: I1006 08:39:15.321130 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:15 crc kubenswrapper[4991]: I1006 08:39:15.321624 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:16 crc kubenswrapper[4991]: I1006 08:39:16.011587 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b98fcbb5b-2m256" event={"ID":"feb6a9a7-403e-4dc9-903c-349391d84efb","Type":"ContainerStarted","Data":"40cc2581ab3ca423c98e61d01fbf933e125eced21752dcff956d71eaf1890135"} Oct 06 08:39:16 crc kubenswrapper[4991]: I1006 08:39:16.012205 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 08:39:16 crc kubenswrapper[4991]: I1006 08:39:16.012277 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:16 crc kubenswrapper[4991]: I1006 08:39:16.012306 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:16 crc kubenswrapper[4991]: I1006 08:39:16.012317 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 08:39:18 crc kubenswrapper[4991]: I1006 08:39:18.027855 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:39:18 crc kubenswrapper[4991]: I1006 08:39:18.028358 4991 prober_manager.go:312] "Failed to trigger a manual 
run" probe="Readiness" Oct 06 08:39:18 crc kubenswrapper[4991]: I1006 08:39:18.030783 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:18 crc kubenswrapper[4991]: I1006 08:39:18.084977 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 08:39:18 crc kubenswrapper[4991]: I1006 08:39:18.091074 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 08:39:18 crc kubenswrapper[4991]: I1006 08:39:18.091188 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:39:18 crc kubenswrapper[4991]: I1006 08:39:18.125528 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 08:39:25 crc kubenswrapper[4991]: I1006 08:39:25.218705 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:25 crc kubenswrapper[4991]: I1006 08:39:25.330852 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-db-sync-config-data\") pod \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\" (UID: \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " Oct 06 08:39:25 crc kubenswrapper[4991]: I1006 08:39:25.330938 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-combined-ca-bundle\") pod \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\" (UID: \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " Oct 06 08:39:25 crc kubenswrapper[4991]: I1006 08:39:25.330965 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29pdv\" (UniqueName: \"kubernetes.io/projected/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-kube-api-access-29pdv\") pod \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\" (UID: \"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5\") " Oct 06 08:39:25 crc kubenswrapper[4991]: I1006 08:39:25.336991 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5" (UID: "85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:25 crc kubenswrapper[4991]: I1006 08:39:25.337238 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-kube-api-access-29pdv" (OuterVolumeSpecName: "kube-api-access-29pdv") pod "85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5" (UID: "85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5"). 
InnerVolumeSpecName "kube-api-access-29pdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:25 crc kubenswrapper[4991]: I1006 08:39:25.359173 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5" (UID: "85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:25 crc kubenswrapper[4991]: I1006 08:39:25.433654 4991 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:25 crc kubenswrapper[4991]: I1006 08:39:25.433681 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:25 crc kubenswrapper[4991]: I1006 08:39:25.433708 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29pdv\" (UniqueName: \"kubernetes.io/projected/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5-kube-api-access-29pdv\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.102052 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5sltb" event={"ID":"85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5","Type":"ContainerDied","Data":"835a71678b6487101d76f4781d5e48fe1b8c822b38c065fc4637078b65db9ccf"} Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.102359 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="835a71678b6487101d76f4781d5e48fe1b8c822b38c065fc4637078b65db9ccf" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.102435 4991 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5sltb" Oct 06 08:39:26 crc kubenswrapper[4991]: E1006 08:39:26.391733 4991 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 06 08:39:26 crc kubenswrapper[4991]: E1006 08:39:26.391892 4991 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/
tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r9b2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-52tpz_openstack(5ae62f13-d5be-414e-a6f9-9b2e475afbd1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 08:39:26 crc kubenswrapper[4991]: E1006 08:39:26.393527 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-52tpz" podUID="5ae62f13-d5be-414e-a6f9-9b2e475afbd1" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.402809 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-79798cd5c5-jz6kb"] Oct 06 08:39:26 crc kubenswrapper[4991]: E1006 08:39:26.403198 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5" containerName="barbican-db-sync" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.403215 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5" containerName="barbican-db-sync" Oct 06 08:39:26 crc 
kubenswrapper[4991]: I1006 08:39:26.403393 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5" containerName="barbican-db-sync" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.404255 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.424287 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.424374 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.424559 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-snz5j" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.432524 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-75c547987d-brwwk"] Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.434057 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.435908 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.446389 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-79798cd5c5-jz6kb"] Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.473463 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75c547987d-brwwk"] Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.556697 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2720ee8-eb06-4a0b-9bee-153b69ee769e-logs\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.556938 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-combined-ca-bundle\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.556986 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-combined-ca-bundle\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.557021 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.557051 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data-custom\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.557081 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4t26\" (UniqueName: \"kubernetes.io/projected/ab7f3760-250c-4e34-8bde-7e9218b711ff-kube-api-access-m4t26\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.557099 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data-custom\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.557116 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwtt8\" (UniqueName: \"kubernetes.io/projected/b2720ee8-eb06-4a0b-9bee-153b69ee769e-kube-api-access-cwtt8\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: 
\"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.557139 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab7f3760-250c-4e34-8bde-7e9218b711ff-logs\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.557195 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.581094 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-mchn7"] Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.582519 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.641883 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-mchn7"] Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660055 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660121 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660327 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqpm2\" (UniqueName: \"kubernetes.io/projected/267689f6-2a4c-4996-bd55-61ecc644a5b9-kube-api-access-tqpm2\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660390 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2720ee8-eb06-4a0b-9bee-153b69ee769e-logs\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660413 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-combined-ca-bundle\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660474 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660529 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-combined-ca-bundle\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660581 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660609 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660629 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-config\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660681 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data-custom\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660697 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660744 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4t26\" (UniqueName: \"kubernetes.io/projected/ab7f3760-250c-4e34-8bde-7e9218b711ff-kube-api-access-m4t26\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660768 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data-custom\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc 
kubenswrapper[4991]: I1006 08:39:26.660788 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwtt8\" (UniqueName: \"kubernetes.io/projected/b2720ee8-eb06-4a0b-9bee-153b69ee769e-kube-api-access-cwtt8\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.660818 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab7f3760-250c-4e34-8bde-7e9218b711ff-logs\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.661858 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab7f3760-250c-4e34-8bde-7e9218b711ff-logs\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.670437 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-combined-ca-bundle\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.670777 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data-custom\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc 
kubenswrapper[4991]: I1006 08:39:26.670924 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.671197 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2720ee8-eb06-4a0b-9bee-153b69ee769e-logs\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.671659 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data-custom\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.671946 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69b6ff7f94-j8svl"] Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.673508 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.675087 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.685022 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69b6ff7f94-j8svl"] Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.687326 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.687933 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-combined-ca-bundle\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.695716 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4t26\" (UniqueName: \"kubernetes.io/projected/ab7f3760-250c-4e34-8bde-7e9218b711ff-kube-api-access-m4t26\") pod \"barbican-keystone-listener-75c547987d-brwwk\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.698864 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwtt8\" (UniqueName: \"kubernetes.io/projected/b2720ee8-eb06-4a0b-9bee-153b69ee769e-kube-api-access-cwtt8\") pod \"barbican-worker-79798cd5c5-jz6kb\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " 
pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.763573 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-combined-ca-bundle\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.764157 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqpm2\" (UniqueName: \"kubernetes.io/projected/267689f6-2a4c-4996-bd55-61ecc644a5b9-kube-api-access-tqpm2\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.764263 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.764378 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.764415 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-config\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " 
pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.765196 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.765454 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.765465 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.766327 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-config\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.766791 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data-custom\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc 
kubenswrapper[4991]: I1006 08:39:26.766851 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.766914 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc64x\" (UniqueName: \"kubernetes.io/projected/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-kube-api-access-zc64x\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.766955 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-logs\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.766961 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.766993 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc 
kubenswrapper[4991]: I1006 08:39:26.767505 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.792136 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqpm2\" (UniqueName: \"kubernetes.io/projected/267689f6-2a4c-4996-bd55-61ecc644a5b9-kube-api-access-tqpm2\") pod \"dnsmasq-dns-6d66f584d7-mchn7\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.806282 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.826589 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-774597bb4-6c42q"] Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.834934 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.868041 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-combined-ca-bundle\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.868178 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data-custom\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.868226 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc64x\" (UniqueName: \"kubernetes.io/projected/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-kube-api-access-zc64x\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.868247 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-logs\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.868268 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " 
pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.868851 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-logs\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.871761 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data-custom\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.872408 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-combined-ca-bundle\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.872795 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: I1006 08:39:26.889822 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc64x\" (UniqueName: \"kubernetes.io/projected/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-kube-api-access-zc64x\") pod \"barbican-api-69b6ff7f94-j8svl\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:26 crc kubenswrapper[4991]: 
I1006 08:39:26.915259 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.085954 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-79798cd5c5-jz6kb"] Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.104669 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.194227 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-774597bb4-6c42q" event={"ID":"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb","Type":"ContainerStarted","Data":"f68568dfb392c06fa1a40abaa08c31d698aef3315cd0b120a40469dd179eccab"} Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.238088 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b98fcbb5b-2m256" event={"ID":"feb6a9a7-403e-4dc9-903c-349391d84efb","Type":"ContainerStarted","Data":"fb10a7813dbd9816182db52dd9a3b501c4c766c110a2193adb6f6007214cdc4f"} Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.239401 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.239605 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.293706 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b98fcbb5b-2m256" podStartSLOduration=16.293685658 podStartE2EDuration="16.293685658s" podCreationTimestamp="2025-10-06 08:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:27.273755596 +0000 UTC m=+1219.011505627" watchObservedRunningTime="2025-10-06 
08:39:27.293685658 +0000 UTC m=+1219.031435679" Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.306356 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7","Type":"ContainerStarted","Data":"b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e"} Oct 06 08:39:27 crc kubenswrapper[4991]: E1006 08:39:27.322087 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-52tpz" podUID="5ae62f13-d5be-414e-a6f9-9b2e475afbd1" Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.581713 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75c547987d-brwwk"] Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.663391 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69b6ff7f94-j8svl"] Oct 06 08:39:27 crc kubenswrapper[4991]: I1006 08:39:27.683863 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-mchn7"] Oct 06 08:39:27 crc kubenswrapper[4991]: W1006 08:39:27.714348 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod267689f6_2a4c_4996_bd55_61ecc644a5b9.slice/crio-50bc356faee11f0d00deab95619e3e6db86833191ae26f6caaa1c4a869387a18 WatchSource:0}: Error finding container 50bc356faee11f0d00deab95619e3e6db86833191ae26f6caaa1c4a869387a18: Status 404 returned error can't find the container with id 50bc356faee11f0d00deab95619e3e6db86833191ae26f6caaa1c4a869387a18 Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.313132 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b6ff7f94-j8svl" 
event={"ID":"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183","Type":"ContainerStarted","Data":"59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1"} Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.313436 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b6ff7f94-j8svl" event={"ID":"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183","Type":"ContainerStarted","Data":"766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e"} Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.313452 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b6ff7f94-j8svl" event={"ID":"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183","Type":"ContainerStarted","Data":"845bbf393ade326038caa2d545517f93cb7d38386f9ab9ae3be86e02500045a9"} Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.313492 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.313514 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.315205 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" event={"ID":"ab7f3760-250c-4e34-8bde-7e9218b711ff","Type":"ContainerStarted","Data":"a9a6952d5ef0cdd5ded874846bc65015cacc3696b9a2c51c948ab0619ed3b799"} Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.317158 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-774597bb4-6c42q" event={"ID":"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb","Type":"ContainerStarted","Data":"0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0"} Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.317321 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:28 crc kubenswrapper[4991]: 
I1006 08:39:28.318570 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-79798cd5c5-jz6kb" event={"ID":"b2720ee8-eb06-4a0b-9bee-153b69ee769e","Type":"ContainerStarted","Data":"c722d321ae9a78f88e82287a1583ca6ae7044e2e4bb3a86e94b4936e702b1a57"} Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.326277 4991 generic.go:334] "Generic (PLEG): container finished" podID="267689f6-2a4c-4996-bd55-61ecc644a5b9" containerID="6f48c10672832be4a8045cde725a38924d95a8afded3d2328e80bb28744bf478" exitCode=0 Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.326577 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" event={"ID":"267689f6-2a4c-4996-bd55-61ecc644a5b9","Type":"ContainerDied","Data":"6f48c10672832be4a8045cde725a38924d95a8afded3d2328e80bb28744bf478"} Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.326627 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" event={"ID":"267689f6-2a4c-4996-bd55-61ecc644a5b9","Type":"ContainerStarted","Data":"50bc356faee11f0d00deab95619e3e6db86833191ae26f6caaa1c4a869387a18"} Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.359641 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-69b6ff7f94-j8svl" podStartSLOduration=2.359616343 podStartE2EDuration="2.359616343s" podCreationTimestamp="2025-10-06 08:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:28.333451676 +0000 UTC m=+1220.071201707" watchObservedRunningTime="2025-10-06 08:39:28.359616343 +0000 UTC m=+1220.097366364" Oct 06 08:39:28 crc kubenswrapper[4991]: I1006 08:39:28.396752 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-774597bb4-6c42q" podStartSLOduration=15.396734557 podStartE2EDuration="15.396734557s" 
podCreationTimestamp="2025-10-06 08:39:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:28.357769052 +0000 UTC m=+1220.095519073" watchObservedRunningTime="2025-10-06 08:39:28.396734557 +0000 UTC m=+1220.134484578" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.011468 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.341390 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" event={"ID":"267689f6-2a4c-4996-bd55-61ecc644a5b9","Type":"ContainerStarted","Data":"27cdebf646172eb8c2d13bbd44f32fd1184730d2d7a9115c1e40efa64977d0dd"} Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.363433 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" podStartSLOduration=3.363412868 podStartE2EDuration="3.363412868s" podCreationTimestamp="2025-10-06 08:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:29.355613793 +0000 UTC m=+1221.093363814" watchObservedRunningTime="2025-10-06 08:39:29.363412868 +0000 UTC m=+1221.101162899" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.391917 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-548cc795f4-8m4d9"] Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.393375 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.398008 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.411264 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.426402 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-548cc795f4-8m4d9"] Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.453398 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc554\" (UniqueName: \"kubernetes.io/projected/a9be32ba-d183-4fd5-ba8b-63f79c973c81-kube-api-access-hc554\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.453543 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data-custom\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.453563 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9be32ba-d183-4fd5-ba8b-63f79c973c81-logs\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.453651 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.453711 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-combined-ca-bundle\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.453759 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-internal-tls-certs\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.453831 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-public-tls-certs\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.585424 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-combined-ca-bundle\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.585647 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-internal-tls-certs\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.585737 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-public-tls-certs\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.585856 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc554\" (UniqueName: \"kubernetes.io/projected/a9be32ba-d183-4fd5-ba8b-63f79c973c81-kube-api-access-hc554\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.585981 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data-custom\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.586009 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9be32ba-d183-4fd5-ba8b-63f79c973c81-logs\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.586090 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.589524 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9be32ba-d183-4fd5-ba8b-63f79c973c81-logs\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.593553 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.593949 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-combined-ca-bundle\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.596534 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-public-tls-certs\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.597058 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data-custom\") pod 
\"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.607024 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-internal-tls-certs\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.611927 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc554\" (UniqueName: \"kubernetes.io/projected/a9be32ba-d183-4fd5-ba8b-63f79c973c81-kube-api-access-hc554\") pod \"barbican-api-548cc795f4-8m4d9\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:29 crc kubenswrapper[4991]: I1006 08:39:29.709950 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:30 crc kubenswrapper[4991]: I1006 08:39:30.350645 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:30 crc kubenswrapper[4991]: I1006 08:39:30.749614 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-548cc795f4-8m4d9"] Oct 06 08:39:30 crc kubenswrapper[4991]: W1006 08:39:30.760490 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9be32ba_d183_4fd5_ba8b_63f79c973c81.slice/crio-773d7d55a52709c57ac795cd9a5c5b4cc33c328c660fb36301f52c84e195de8a WatchSource:0}: Error finding container 773d7d55a52709c57ac795cd9a5c5b4cc33c328c660fb36301f52c84e195de8a: Status 404 returned error can't find the container with id 773d7d55a52709c57ac795cd9a5c5b4cc33c328c660fb36301f52c84e195de8a Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.368360 4991 generic.go:334] "Generic (PLEG): container finished" podID="8ad2fab6-f115-4ab3-b631-242ef3474da2" containerID="c8554f9b2917b9400926d1608cbf5f4f2c2d666a21fe6417cd6a5eadb0c003c4" exitCode=0 Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.368428 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s5hfs" event={"ID":"8ad2fab6-f115-4ab3-b631-242ef3474da2","Type":"ContainerDied","Data":"c8554f9b2917b9400926d1608cbf5f4f2c2d666a21fe6417cd6a5eadb0c003c4"} Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.372190 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548cc795f4-8m4d9" event={"ID":"a9be32ba-d183-4fd5-ba8b-63f79c973c81","Type":"ContainerStarted","Data":"0c70b35b1a4450b4db02a166e4cb0db2437a7fcc554b453e7d86b3f8efc7685d"} Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.372233 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548cc795f4-8m4d9" 
event={"ID":"a9be32ba-d183-4fd5-ba8b-63f79c973c81","Type":"ContainerStarted","Data":"ed3d4866db94527f98aa6062572670cd20f71dc34b4e9fe3ca2ccfae1b03bda2"} Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.372246 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548cc795f4-8m4d9" event={"ID":"a9be32ba-d183-4fd5-ba8b-63f79c973c81","Type":"ContainerStarted","Data":"773d7d55a52709c57ac795cd9a5c5b4cc33c328c660fb36301f52c84e195de8a"} Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.372350 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.373921 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" event={"ID":"ab7f3760-250c-4e34-8bde-7e9218b711ff","Type":"ContainerStarted","Data":"4410132c0aa760f431a4973b217154eb03f1a1acbcd426c9c298f0ff9b2290ca"} Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.373955 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" event={"ID":"ab7f3760-250c-4e34-8bde-7e9218b711ff","Type":"ContainerStarted","Data":"118c1de5d5587e621349f796c4342c648e5951d232a1d0e3dcb3fd1f0b4f7705"} Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.377611 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-79798cd5c5-jz6kb" event={"ID":"b2720ee8-eb06-4a0b-9bee-153b69ee769e","Type":"ContainerStarted","Data":"d85323f86704585d0954acacab967e959246d8405dc02badbe6e793e45cbe71b"} Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.377818 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-79798cd5c5-jz6kb" event={"ID":"b2720ee8-eb06-4a0b-9bee-153b69ee769e","Type":"ContainerStarted","Data":"6cff5f4fbbdd1f8906fc62c1c37a25292d4484752ab929ce099738e8a4117501"} Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 
08:39:31.411156 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-548cc795f4-8m4d9" podStartSLOduration=2.41113444 podStartE2EDuration="2.41113444s" podCreationTimestamp="2025-10-06 08:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:31.408716473 +0000 UTC m=+1223.146466494" watchObservedRunningTime="2025-10-06 08:39:31.41113444 +0000 UTC m=+1223.148884481" Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.434513 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" podStartSLOduration=2.734927977 podStartE2EDuration="5.434491693s" podCreationTimestamp="2025-10-06 08:39:26 +0000 UTC" firstStartedPulling="2025-10-06 08:39:27.594573562 +0000 UTC m=+1219.332323583" lastFinishedPulling="2025-10-06 08:39:30.294137278 +0000 UTC m=+1222.031887299" observedRunningTime="2025-10-06 08:39:31.425158207 +0000 UTC m=+1223.162908248" watchObservedRunningTime="2025-10-06 08:39:31.434491693 +0000 UTC m=+1223.172241714" Oct 06 08:39:31 crc kubenswrapper[4991]: I1006 08:39:31.449809 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-79798cd5c5-jz6kb" podStartSLOduration=2.47972982 podStartE2EDuration="5.449784934s" podCreationTimestamp="2025-10-06 08:39:26 +0000 UTC" firstStartedPulling="2025-10-06 08:39:27.32213919 +0000 UTC m=+1219.059889211" lastFinishedPulling="2025-10-06 08:39:30.292194304 +0000 UTC m=+1222.029944325" observedRunningTime="2025-10-06 08:39:31.44452418 +0000 UTC m=+1223.182274221" watchObservedRunningTime="2025-10-06 08:39:31.449784934 +0000 UTC m=+1223.187534955" Oct 06 08:39:32 crc kubenswrapper[4991]: I1006 08:39:32.386566 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:33 crc 
kubenswrapper[4991]: I1006 08:39:33.892713 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.002608 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.090946 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-config\") pod \"8ad2fab6-f115-4ab3-b631-242ef3474da2\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.091024 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q56zc\" (UniqueName: \"kubernetes.io/projected/8ad2fab6-f115-4ab3-b631-242ef3474da2-kube-api-access-q56zc\") pod \"8ad2fab6-f115-4ab3-b631-242ef3474da2\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.091248 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-combined-ca-bundle\") pod \"8ad2fab6-f115-4ab3-b631-242ef3474da2\" (UID: \"8ad2fab6-f115-4ab3-b631-242ef3474da2\") " Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.129214 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad2fab6-f115-4ab3-b631-242ef3474da2-kube-api-access-q56zc" (OuterVolumeSpecName: "kube-api-access-q56zc") pod "8ad2fab6-f115-4ab3-b631-242ef3474da2" (UID: "8ad2fab6-f115-4ab3-b631-242ef3474da2"). InnerVolumeSpecName "kube-api-access-q56zc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.132352 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ad2fab6-f115-4ab3-b631-242ef3474da2" (UID: "8ad2fab6-f115-4ab3-b631-242ef3474da2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.134064 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-config" (OuterVolumeSpecName: "config") pod "8ad2fab6-f115-4ab3-b631-242ef3474da2" (UID: "8ad2fab6-f115-4ab3-b631-242ef3474da2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.193046 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q56zc\" (UniqueName: \"kubernetes.io/projected/8ad2fab6-f115-4ab3-b631-242ef3474da2-kube-api-access-q56zc\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.193073 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.193081 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ad2fab6-f115-4ab3-b631-242ef3474da2-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.413109 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-s5hfs" 
event={"ID":"8ad2fab6-f115-4ab3-b631-242ef3474da2","Type":"ContainerDied","Data":"004b2ffb9c7886018ea8114a4e2c64b27b31b2c55a116049495834bacf8e2ed9"} Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.413398 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="004b2ffb9c7886018ea8114a4e2c64b27b31b2c55a116049495834bacf8e2ed9" Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.413147 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-s5hfs" Oct 06 08:39:35 crc kubenswrapper[4991]: I1006 08:39:35.510355 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.174457 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-mchn7"] Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.174758 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" podUID="267689f6-2a4c-4996-bd55-61ecc644a5b9" containerName="dnsmasq-dns" containerID="cri-o://27cdebf646172eb8c2d13bbd44f32fd1184730d2d7a9115c1e40efa64977d0dd" gracePeriod=10 Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.180552 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.226415 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-t4nb8"] Oct 06 08:39:36 crc kubenswrapper[4991]: E1006 08:39:36.226872 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad2fab6-f115-4ab3-b631-242ef3474da2" containerName="neutron-db-sync" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.226892 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad2fab6-f115-4ab3-b631-242ef3474da2" containerName="neutron-db-sync" Oct 
06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.227075 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad2fab6-f115-4ab3-b631-242ef3474da2" containerName="neutron-db-sync" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.230209 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.251091 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-t4nb8"] Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.323587 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.323677 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.323725 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.323761 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-config\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.323869 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmqn5\" (UniqueName: \"kubernetes.io/projected/dbd747a7-d54f-46ac-9bde-b887a0450f66-kube-api-access-rmqn5\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.323898 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-svc\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.387220 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f9885cd76-4cxdt"] Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.388772 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.399249 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.399741 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.400086 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.401163 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5cb9d" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.425715 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-config\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.425823 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmqn5\" (UniqueName: \"kubernetes.io/projected/dbd747a7-d54f-46ac-9bde-b887a0450f66-kube-api-access-rmqn5\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.425852 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-svc\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.425917 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.425973 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.425990 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.426947 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.427159 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-svc\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.427519 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.427753 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-config\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.428118 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.467951 4991 generic.go:334] "Generic (PLEG): container finished" podID="267689f6-2a4c-4996-bd55-61ecc644a5b9" containerID="27cdebf646172eb8c2d13bbd44f32fd1184730d2d7a9115c1e40efa64977d0dd" exitCode=0 Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.468004 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" event={"ID":"267689f6-2a4c-4996-bd55-61ecc644a5b9","Type":"ContainerDied","Data":"27cdebf646172eb8c2d13bbd44f32fd1184730d2d7a9115c1e40efa64977d0dd"} Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.471997 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f9885cd76-4cxdt"] Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.528215 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-httpd-config\") pod \"neutron-6f9885cd76-4cxdt\" 
(UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.528279 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-ovndb-tls-certs\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.528334 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-config\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.528368 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-combined-ca-bundle\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.528410 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctrl9\" (UniqueName: \"kubernetes.io/projected/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-kube-api-access-ctrl9\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.542760 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmqn5\" (UniqueName: \"kubernetes.io/projected/dbd747a7-d54f-46ac-9bde-b887a0450f66-kube-api-access-rmqn5\") pod \"dnsmasq-dns-688c87cc99-t4nb8\" (UID: 
\"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.641627 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-config\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.641681 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-combined-ca-bundle\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.641721 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctrl9\" (UniqueName: \"kubernetes.io/projected/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-kube-api-access-ctrl9\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.641802 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-httpd-config\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.641822 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-ovndb-tls-certs\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 
08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.647947 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.651208 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-httpd-config\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.653352 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-config\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.662065 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-combined-ca-bundle\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.664002 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-ovndb-tls-certs\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.673398 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctrl9\" (UniqueName: \"kubernetes.io/projected/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-kube-api-access-ctrl9\") pod \"neutron-6f9885cd76-4cxdt\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " 
pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.765917 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:39:36 crc kubenswrapper[4991]: I1006 08:39:36.916194 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" podUID="267689f6-2a4c-4996-bd55-61ecc644a5b9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: connect: connection refused" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.115793 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.296832 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-svc\") pod \"267689f6-2a4c-4996-bd55-61ecc644a5b9\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.296911 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-config\") pod \"267689f6-2a4c-4996-bd55-61ecc644a5b9\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.297002 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-sb\") pod \"267689f6-2a4c-4996-bd55-61ecc644a5b9\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.297049 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqpm2\" (UniqueName: 
\"kubernetes.io/projected/267689f6-2a4c-4996-bd55-61ecc644a5b9-kube-api-access-tqpm2\") pod \"267689f6-2a4c-4996-bd55-61ecc644a5b9\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.297149 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-nb\") pod \"267689f6-2a4c-4996-bd55-61ecc644a5b9\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.297257 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-swift-storage-0\") pod \"267689f6-2a4c-4996-bd55-61ecc644a5b9\" (UID: \"267689f6-2a4c-4996-bd55-61ecc644a5b9\") " Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.306565 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267689f6-2a4c-4996-bd55-61ecc644a5b9-kube-api-access-tqpm2" (OuterVolumeSpecName: "kube-api-access-tqpm2") pod "267689f6-2a4c-4996-bd55-61ecc644a5b9" (UID: "267689f6-2a4c-4996-bd55-61ecc644a5b9"). InnerVolumeSpecName "kube-api-access-tqpm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.352447 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "267689f6-2a4c-4996-bd55-61ecc644a5b9" (UID: "267689f6-2a4c-4996-bd55-61ecc644a5b9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.366062 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "267689f6-2a4c-4996-bd55-61ecc644a5b9" (UID: "267689f6-2a4c-4996-bd55-61ecc644a5b9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.366434 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "267689f6-2a4c-4996-bd55-61ecc644a5b9" (UID: "267689f6-2a4c-4996-bd55-61ecc644a5b9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.374690 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-config" (OuterVolumeSpecName: "config") pod "267689f6-2a4c-4996-bd55-61ecc644a5b9" (UID: "267689f6-2a4c-4996-bd55-61ecc644a5b9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.399576 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.399610 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.399620 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.399628 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.399637 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqpm2\" (UniqueName: \"kubernetes.io/projected/267689f6-2a4c-4996-bd55-61ecc644a5b9-kube-api-access-tqpm2\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.403142 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "267689f6-2a4c-4996-bd55-61ecc644a5b9" (UID: "267689f6-2a4c-4996-bd55-61ecc644a5b9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 08:39:38 crc kubenswrapper[4991]: W1006 08:39:38.404479 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9816fde_c4d0_4c01_8d09_2af0f4256fd1.slice/crio-ee15d8b8c296c51105a86cd7c291466f6a124af85521f5e19b4cfe08181eba2c WatchSource:0}: Error finding container ee15d8b8c296c51105a86cd7c291466f6a124af85521f5e19b4cfe08181eba2c: Status 404 returned error can't find the container with id ee15d8b8c296c51105a86cd7c291466f6a124af85521f5e19b4cfe08181eba2c
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.414001 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f9885cd76-4cxdt"]
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.439111 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-t4nb8"]
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.501490 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/267689f6-2a4c-4996-bd55-61ecc644a5b9-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.511241 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7","Type":"ContainerStarted","Data":"4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0"}
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.511413 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="ceilometer-central-agent" containerID="cri-o://e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651" gracePeriod=30
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.511673 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.511685 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="proxy-httpd" containerID="cri-o://4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0" gracePeriod=30
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.511725 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="ceilometer-notification-agent" containerID="cri-o://b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7" gracePeriod=30
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.511749 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="sg-core" containerID="cri-o://b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e" gracePeriod=30
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.524865 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7" event={"ID":"267689f6-2a4c-4996-bd55-61ecc644a5b9","Type":"ContainerDied","Data":"50bc356faee11f0d00deab95619e3e6db86833191ae26f6caaa1c4a869387a18"}
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.524945 4991 scope.go:117] "RemoveContainer" containerID="27cdebf646172eb8c2d13bbd44f32fd1184730d2d7a9115c1e40efa64977d0dd"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.525073 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-mchn7"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.550218 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9885cd76-4cxdt" event={"ID":"e9816fde-c4d0-4c01-8d09-2af0f4256fd1","Type":"ContainerStarted","Data":"ee15d8b8c296c51105a86cd7c291466f6a124af85521f5e19b4cfe08181eba2c"}
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.558997 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" event={"ID":"dbd747a7-d54f-46ac-9bde-b887a0450f66","Type":"ContainerStarted","Data":"53f5343e711f8d11492796264f9b06f1a654617f478363bc4604235655b56abb"}
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.573263 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.626432294 podStartE2EDuration="44.573244507s" podCreationTimestamp="2025-10-06 08:38:54 +0000 UTC" firstStartedPulling="2025-10-06 08:38:55.916716063 +0000 UTC m=+1187.654466084" lastFinishedPulling="2025-10-06 08:39:37.863528276 +0000 UTC m=+1229.601278297" observedRunningTime="2025-10-06 08:39:38.55046529 +0000 UTC m=+1230.288215311" watchObservedRunningTime="2025-10-06 08:39:38.573244507 +0000 UTC m=+1230.310994528"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.576382 4991 scope.go:117] "RemoveContainer" containerID="6f48c10672832be4a8045cde725a38924d95a8afded3d2328e80bb28744bf478"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.579127 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-mchn7"]
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.596737 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-mchn7"]
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.964501 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7988dccf5c-j9ll7"]
Oct 06 08:39:38 crc kubenswrapper[4991]: E1006 08:39:38.965097 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267689f6-2a4c-4996-bd55-61ecc644a5b9" containerName="init"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.965114 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="267689f6-2a4c-4996-bd55-61ecc644a5b9" containerName="init"
Oct 06 08:39:38 crc kubenswrapper[4991]: E1006 08:39:38.965127 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267689f6-2a4c-4996-bd55-61ecc644a5b9" containerName="dnsmasq-dns"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.965134 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="267689f6-2a4c-4996-bd55-61ecc644a5b9" containerName="dnsmasq-dns"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.965326 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="267689f6-2a4c-4996-bd55-61ecc644a5b9" containerName="dnsmasq-dns"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.966279 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.969668 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.969792 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Oct 06 08:39:38 crc kubenswrapper[4991]: I1006 08:39:38.999198 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7988dccf5c-j9ll7"]
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.112628 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxshx\" (UniqueName: \"kubernetes.io/projected/0a6703e0-1fac-4734-98ac-88f6163fdaae-kube-api-access-lxshx\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.112698 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.112728 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-public-tls-certs\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.112769 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-internal-tls-certs\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.112852 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.112877 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-combined-ca-bundle\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.112899 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-ovndb-tls-certs\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.214971 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.215029 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-public-tls-certs\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.215081 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-internal-tls-certs\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.215160 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.215199 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-combined-ca-bundle\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.215228 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-ovndb-tls-certs\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.215271 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxshx\" (UniqueName: \"kubernetes.io/projected/0a6703e0-1fac-4734-98ac-88f6163fdaae-kube-api-access-lxshx\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.222991 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-internal-tls-certs\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.223120 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-ovndb-tls-certs\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.223655 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.224100 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-public-tls-certs\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.229211 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.231116 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-combined-ca-bundle\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.242202 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxshx\" (UniqueName: \"kubernetes.io/projected/0a6703e0-1fac-4734-98ac-88f6163fdaae-kube-api-access-lxshx\") pod \"neutron-7988dccf5c-j9ll7\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.293756 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267689f6-2a4c-4996-bd55-61ecc644a5b9" path="/var/lib/kubelet/pods/267689f6-2a4c-4996-bd55-61ecc644a5b9/volumes"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.376539 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.577652 4991 generic.go:334] "Generic (PLEG): container finished" podID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerID="4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0" exitCode=0
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.577877 4991 generic.go:334] "Generic (PLEG): container finished" podID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerID="b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e" exitCode=2
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.577884 4991 generic.go:334] "Generic (PLEG): container finished" podID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerID="e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651" exitCode=0
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.578203 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7","Type":"ContainerDied","Data":"4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0"}
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.578236 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7","Type":"ContainerDied","Data":"b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e"}
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.578317 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7","Type":"ContainerDied","Data":"e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651"}
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.584583 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9885cd76-4cxdt" event={"ID":"e9816fde-c4d0-4c01-8d09-2af0f4256fd1","Type":"ContainerStarted","Data":"46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323"}
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.584648 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9885cd76-4cxdt" event={"ID":"e9816fde-c4d0-4c01-8d09-2af0f4256fd1","Type":"ContainerStarted","Data":"361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa"}
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.585540 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f9885cd76-4cxdt"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.607733 4991 generic.go:334] "Generic (PLEG): container finished" podID="dbd747a7-d54f-46ac-9bde-b887a0450f66" containerID="ca2360137afb48d76ce897a21f84ad332bc448e563435e9f6089ee7c7ec5822d" exitCode=0
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.607773 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" event={"ID":"dbd747a7-d54f-46ac-9bde-b887a0450f66","Type":"ContainerDied","Data":"ca2360137afb48d76ce897a21f84ad332bc448e563435e9f6089ee7c7ec5822d"}
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.612047 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f9885cd76-4cxdt" podStartSLOduration=3.6120254149999997 podStartE2EDuration="3.612025415s" podCreationTimestamp="2025-10-06 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:39.605162616 +0000 UTC m=+1231.342912637" watchObservedRunningTime="2025-10-06 08:39:39.612025415 +0000 UTC m=+1231.349775436"
Oct 06 08:39:39 crc kubenswrapper[4991]: I1006 08:39:39.948483 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7988dccf5c-j9ll7"]
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.204913 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.341415 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl2nm\" (UniqueName: \"kubernetes.io/projected/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-kube-api-access-nl2nm\") pod \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") "
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.341911 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-sg-core-conf-yaml\") pod \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") "
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.341947 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-run-httpd\") pod \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") "
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.341970 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-scripts\") pod \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") "
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.342003 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-config-data\") pod \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") "
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.342122 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-log-httpd\") pod \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") "
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.342186 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-combined-ca-bundle\") pod \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\" (UID: \"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7\") "
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.343535 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" (UID: "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.344730 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" (UID: "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.350166 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-kube-api-access-nl2nm" (OuterVolumeSpecName: "kube-api-access-nl2nm") pod "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" (UID: "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7"). InnerVolumeSpecName "kube-api-access-nl2nm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.350428 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-scripts" (OuterVolumeSpecName: "scripts") pod "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" (UID: "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.381530 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" (UID: "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.419716 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" (UID: "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.456159 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.456193 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl2nm\" (UniqueName: \"kubernetes.io/projected/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-kube-api-access-nl2nm\") on node \"crc\" DevicePath \"\""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.456205 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.456214 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.456222 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.456229 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.483930 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-config-data" (OuterVolumeSpecName: "config-data") pod "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" (UID: "6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.557271 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.620786 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7988dccf5c-j9ll7" event={"ID":"0a6703e0-1fac-4734-98ac-88f6163fdaae","Type":"ContainerStarted","Data":"93e5b235f20e302b6749df9897200518a9608b53c7db75afd7a755bd7c31a9e2"}
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.620828 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7988dccf5c-j9ll7" event={"ID":"0a6703e0-1fac-4734-98ac-88f6163fdaae","Type":"ContainerStarted","Data":"d0aac78aa43c86da1a2d4708b970a7fa2c38a878adf032b4bc160cf815163a9d"}
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.620837 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7988dccf5c-j9ll7" event={"ID":"0a6703e0-1fac-4734-98ac-88f6163fdaae","Type":"ContainerStarted","Data":"bafafb918c15f75eb2e131f6d4885779b16767048667dbc16d890cfa68fdaa1f"}
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.621846 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7988dccf5c-j9ll7"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.624279 4991 generic.go:334] "Generic (PLEG): container finished" podID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerID="b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7" exitCode=0
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.624338 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7","Type":"ContainerDied","Data":"b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7"}
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.624369 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7","Type":"ContainerDied","Data":"0ccdea44e2cd5d77caed8416ae9b1bfe9c4e20489e228fe015bba5a525df56d4"}
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.624386 4991 scope.go:117] "RemoveContainer" containerID="4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.624497 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.635744 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" event={"ID":"dbd747a7-d54f-46ac-9bde-b887a0450f66","Type":"ContainerStarted","Data":"172548e0ef4cffabbdaced9b173d5753b7ee3d6c70c09640940f3f0dea8bf7cd"}
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.635797 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.651840 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7988dccf5c-j9ll7" podStartSLOduration=2.65181877 podStartE2EDuration="2.65181877s" podCreationTimestamp="2025-10-06 08:39:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:40.646622267 +0000 UTC m=+1232.384372288" watchObservedRunningTime="2025-10-06 08:39:40.65181877 +0000 UTC m=+1232.389568791"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.671799 4991 scope.go:117] "RemoveContainer" containerID="b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.685661 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" podStartSLOduration=4.685637651 podStartE2EDuration="4.685637651s" podCreationTimestamp="2025-10-06 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:40.679525013 +0000 UTC m=+1232.417275034" watchObservedRunningTime="2025-10-06 08:39:40.685637651 +0000 UTC m=+1232.423387682"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.718995 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.748479 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.769163 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 06 08:39:40 crc kubenswrapper[4991]: E1006 08:39:40.769646 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="ceilometer-notification-agent"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.769672 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="ceilometer-notification-agent"
Oct 06 08:39:40 crc kubenswrapper[4991]: E1006 08:39:40.769697 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="sg-core"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.769705 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="sg-core"
Oct 06 08:39:40 crc kubenswrapper[4991]: E1006 08:39:40.769729 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="proxy-httpd"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.769736 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="proxy-httpd"
Oct 06 08:39:40 crc kubenswrapper[4991]: E1006 08:39:40.769762 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="ceilometer-central-agent"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.769776 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="ceilometer-central-agent"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.769993 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="sg-core"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.770021 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="proxy-httpd"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.770033 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="ceilometer-central-agent"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.770045 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" containerName="ceilometer-notification-agent"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.774236 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.778549 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.778945 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.790246 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.863381 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-log-httpd\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.863448 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.863489 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-run-httpd\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.863716 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-config-data\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.863731 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-scripts\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.863761 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l68b\" (UniqueName: \"kubernetes.io/projected/b0071b3d-3fcd-477d-b161-1aff43447013-kube-api-access-2l68b\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.863784 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.920444 4991 scope.go:117] "RemoveContainer" containerID="b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.965310 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l68b\" (UniqueName: \"kubernetes.io/projected/b0071b3d-3fcd-477d-b161-1aff43447013-kube-api-access-2l68b\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.965364 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.965439 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-log-httpd\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.965479 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.965521 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-run-httpd\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.965559 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-config-data\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.965579 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-scripts\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0"
Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.966391 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName:
\"kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-run-httpd\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0" Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.967934 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-log-httpd\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0" Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.972795 4991 scope.go:117] "RemoveContainer" containerID="e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651" Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.973485 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0" Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.973624 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-scripts\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0" Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.975476 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0" Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.980763 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-config-data\") pod \"ceilometer-0\" (UID: 
\"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0" Oct 06 08:39:40 crc kubenswrapper[4991]: I1006 08:39:40.982673 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l68b\" (UniqueName: \"kubernetes.io/projected/b0071b3d-3fcd-477d-b161-1aff43447013-kube-api-access-2l68b\") pod \"ceilometer-0\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " pod="openstack/ceilometer-0" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.042131 4991 scope.go:117] "RemoveContainer" containerID="4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0" Oct 06 08:39:41 crc kubenswrapper[4991]: E1006 08:39:41.042850 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0\": container with ID starting with 4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0 not found: ID does not exist" containerID="4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.042892 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0"} err="failed to get container status \"4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0\": rpc error: code = NotFound desc = could not find container \"4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0\": container with ID starting with 4ddd5c9daa83d1e3013aabe70185e967709352d99536e9687894f47eca1a45c0 not found: ID does not exist" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.042920 4991 scope.go:117] "RemoveContainer" containerID="b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e" Oct 06 08:39:41 crc kubenswrapper[4991]: E1006 08:39:41.047652 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e\": container with ID starting with b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e not found: ID does not exist" containerID="b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.047729 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e"} err="failed to get container status \"b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e\": rpc error: code = NotFound desc = could not find container \"b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e\": container with ID starting with b77bb7076469e7eb7423697aa8a9be465bdf56b67c58b911bd9aeec20fd5441e not found: ID does not exist" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.047821 4991 scope.go:117] "RemoveContainer" containerID="b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7" Oct 06 08:39:41 crc kubenswrapper[4991]: E1006 08:39:41.048256 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7\": container with ID starting with b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7 not found: ID does not exist" containerID="b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.048317 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7"} err="failed to get container status \"b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7\": rpc error: code = NotFound desc = could not find container 
\"b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7\": container with ID starting with b2bcd03c3e81f8de06cbf61169ae7b6fd4eeb91aa762099d2ec07ab5568397c7 not found: ID does not exist" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.048356 4991 scope.go:117] "RemoveContainer" containerID="e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651" Oct 06 08:39:41 crc kubenswrapper[4991]: E1006 08:39:41.048821 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651\": container with ID starting with e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651 not found: ID does not exist" containerID="e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.048847 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651"} err="failed to get container status \"e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651\": rpc error: code = NotFound desc = could not find container \"e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651\": container with ID starting with e4a8e0848a5fe23fdfca53c9b3086c172a26fd0dbad330c17befeb44b2695651 not found: ID does not exist" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.217837 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.260575 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7" path="/var/lib/kubelet/pods/6e4f2d7f-dc49-418f-84b6-a5d8ad0c66f7/volumes" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.357858 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.642836 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-52tpz" event={"ID":"5ae62f13-d5be-414e-a6f9-9b2e475afbd1","Type":"ContainerStarted","Data":"bdf0cebdfc6bfe885875c71707250ed3c4a35ce750f74c8d41fb559482de14ee"} Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.688475 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-52tpz" podStartSLOduration=7.193054558 podStartE2EDuration="43.688457168s" podCreationTimestamp="2025-10-06 08:38:58 +0000 UTC" firstStartedPulling="2025-10-06 08:39:04.255035586 +0000 UTC m=+1195.992785607" lastFinishedPulling="2025-10-06 08:39:40.750438196 +0000 UTC m=+1232.488188217" observedRunningTime="2025-10-06 08:39:41.667688066 +0000 UTC m=+1233.405438097" watchObservedRunningTime="2025-10-06 08:39:41.688457168 +0000 UTC m=+1233.426207189" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.693110 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.697524 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.768129 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69b6ff7f94-j8svl"] Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.768400 4991 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69b6ff7f94-j8svl" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerName="barbican-api-log" containerID="cri-o://766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e" gracePeriod=30 Oct 06 08:39:41 crc kubenswrapper[4991]: I1006 08:39:41.768798 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69b6ff7f94-j8svl" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerName="barbican-api" containerID="cri-o://59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1" gracePeriod=30 Oct 06 08:39:42 crc kubenswrapper[4991]: I1006 08:39:42.255357 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:39:42 crc kubenswrapper[4991]: I1006 08:39:42.654728 4991 generic.go:334] "Generic (PLEG): container finished" podID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerID="766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e" exitCode=143 Oct 06 08:39:42 crc kubenswrapper[4991]: I1006 08:39:42.654963 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b6ff7f94-j8svl" event={"ID":"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183","Type":"ContainerDied","Data":"766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e"} Oct 06 08:39:42 crc kubenswrapper[4991]: I1006 08:39:42.659384 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0071b3d-3fcd-477d-b161-1aff43447013","Type":"ContainerStarted","Data":"325e696150255f12531bc73944ff65e9d016656eb3fc3e1c4ac24eabd55f90f5"} Oct 06 08:39:43 crc kubenswrapper[4991]: I1006 08:39:43.668215 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b0071b3d-3fcd-477d-b161-1aff43447013","Type":"ContainerStarted","Data":"246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d"} Oct 06 08:39:43 crc kubenswrapper[4991]: I1006 08:39:43.668529 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0071b3d-3fcd-477d-b161-1aff43447013","Type":"ContainerStarted","Data":"1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6"} Oct 06 08:39:44 crc kubenswrapper[4991]: I1006 08:39:44.677892 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0071b3d-3fcd-477d-b161-1aff43447013","Type":"ContainerStarted","Data":"65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96"} Oct 06 08:39:44 crc kubenswrapper[4991]: I1006 08:39:44.927339 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69b6ff7f94-j8svl" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:42046->10.217.0.153:9311: read: connection reset by peer" Oct 06 08:39:44 crc kubenswrapper[4991]: I1006 08:39:44.927402 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69b6ff7f94-j8svl" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:42030->10.217.0.153:9311: read: connection reset by peer" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.229873 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.326288 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.462199 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-logs\") pod \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.462723 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc64x\" (UniqueName: \"kubernetes.io/projected/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-kube-api-access-zc64x\") pod \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.462837 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data-custom\") pod \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.462904 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-logs" (OuterVolumeSpecName: "logs") pod "7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" (UID: "7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.462953 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-combined-ca-bundle\") pod \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.463053 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data\") pod \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\" (UID: \"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183\") " Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.463856 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.469872 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" (UID: "7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.475057 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-kube-api-access-zc64x" (OuterVolumeSpecName: "kube-api-access-zc64x") pod "7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" (UID: "7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183"). InnerVolumeSpecName "kube-api-access-zc64x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.498045 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" (UID: "7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.519728 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data" (OuterVolumeSpecName: "config-data") pod "7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" (UID: "7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.565670 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc64x\" (UniqueName: \"kubernetes.io/projected/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-kube-api-access-zc64x\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.565710 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.565721 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.565732 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183-config-data\") on node 
\"crc\" DevicePath \"\"" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.688242 4991 generic.go:334] "Generic (PLEG): container finished" podID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerID="59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1" exitCode=0 Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.688291 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b6ff7f94-j8svl" event={"ID":"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183","Type":"ContainerDied","Data":"59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1"} Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.688343 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69b6ff7f94-j8svl" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.688668 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b6ff7f94-j8svl" event={"ID":"7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183","Type":"ContainerDied","Data":"845bbf393ade326038caa2d545517f93cb7d38386f9ab9ae3be86e02500045a9"} Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.688707 4991 scope.go:117] "RemoveContainer" containerID="59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.691488 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0071b3d-3fcd-477d-b161-1aff43447013","Type":"ContainerStarted","Data":"178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743"} Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.691731 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.710954 4991 scope.go:117] "RemoveContainer" containerID="766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.733933 4991 scope.go:117] 
"RemoveContainer" containerID="59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1" Oct 06 08:39:45 crc kubenswrapper[4991]: E1006 08:39:45.736055 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1\": container with ID starting with 59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1 not found: ID does not exist" containerID="59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.736098 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1"} err="failed to get container status \"59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1\": rpc error: code = NotFound desc = could not find container \"59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1\": container with ID starting with 59e3171efb0dcfa06b78febe3c9d65cee102db00365b16b688e11c35a31124c1 not found: ID does not exist" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.736127 4991 scope.go:117] "RemoveContainer" containerID="766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e" Oct 06 08:39:45 crc kubenswrapper[4991]: E1006 08:39:45.736499 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e\": container with ID starting with 766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e not found: ID does not exist" containerID="766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.736528 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e"} err="failed to get container status \"766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e\": rpc error: code = NotFound desc = could not find container \"766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e\": container with ID starting with 766705f5a4846c38b668cc6539eac558bd2db62c8a42c5991cabb1c75fc73e5e not found: ID does not exist" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.737905 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.358661053 podStartE2EDuration="5.737892988s" podCreationTimestamp="2025-10-06 08:39:40 +0000 UTC" firstStartedPulling="2025-10-06 08:39:41.697491878 +0000 UTC m=+1233.435241899" lastFinishedPulling="2025-10-06 08:39:45.076723813 +0000 UTC m=+1236.814473834" observedRunningTime="2025-10-06 08:39:45.715229334 +0000 UTC m=+1237.452979355" watchObservedRunningTime="2025-10-06 08:39:45.737892988 +0000 UTC m=+1237.475642999" Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.751337 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69b6ff7f94-j8svl"] Oct 06 08:39:45 crc kubenswrapper[4991]: I1006 08:39:45.759282 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-69b6ff7f94-j8svl"] Oct 06 08:39:46 crc kubenswrapper[4991]: I1006 08:39:46.650547 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:39:46 crc kubenswrapper[4991]: I1006 08:39:46.710813 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5v2tc"] Oct 06 08:39:46 crc kubenswrapper[4991]: I1006 08:39:46.711039 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" podUID="bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" 
containerName="dnsmasq-dns" containerID="cri-o://95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a" gracePeriod=10 Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.254281 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.254435 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" path="/var/lib/kubelet/pods/7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183/volumes" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.400207 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-swift-storage-0\") pod \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.400939 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-sb\") pod \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.400972 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-svc\") pod \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.401435 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-nb\") pod \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " Oct 06 08:39:47 
crc kubenswrapper[4991]: I1006 08:39:47.401722 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-config\") pod \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.401763 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86gc4\" (UniqueName: \"kubernetes.io/projected/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-kube-api-access-86gc4\") pod \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\" (UID: \"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7\") " Oct 06 08:39:47 crc kubenswrapper[4991]: E1006 08:39:47.405269 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ae62f13_d5be_414e_a6f9_9b2e475afbd1.slice/crio-bdf0cebdfc6bfe885875c71707250ed3c4a35ce750f74c8d41fb559482de14ee.scope\": RecentStats: unable to find data in memory cache]" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.443249 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-kube-api-access-86gc4" (OuterVolumeSpecName: "kube-api-access-86gc4") pod "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" (UID: "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7"). InnerVolumeSpecName "kube-api-access-86gc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.466939 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" (UID: "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.468227 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" (UID: "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.468440 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-config" (OuterVolumeSpecName: "config") pod "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" (UID: "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.491375 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" (UID: "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.504244 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.504289 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86gc4\" (UniqueName: \"kubernetes.io/projected/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-kube-api-access-86gc4\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.504322 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.504335 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.504346 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.506426 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" (UID: "bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.532380 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:47 crc kubenswrapper[4991]: E1006 08:39:47.532817 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerName="barbican-api" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.532843 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerName="barbican-api" Oct 06 08:39:47 crc kubenswrapper[4991]: E1006 08:39:47.532875 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" containerName="dnsmasq-dns" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.532885 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" containerName="dnsmasq-dns" Oct 06 08:39:47 crc kubenswrapper[4991]: E1006 08:39:47.532904 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerName="barbican-api-log" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.532913 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerName="barbican-api-log" Oct 06 08:39:47 crc kubenswrapper[4991]: E1006 08:39:47.532937 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" containerName="init" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.532947 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" containerName="init" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.533168 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerName="barbican-api" Oct 06 08:39:47 crc kubenswrapper[4991]: 
I1006 08:39:47.533211 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" containerName="dnsmasq-dns" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.533229 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e4fc4d6-40fb-4557-af1e-4ab7fe2c4183" containerName="barbican-api-log" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.546680 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.546797 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.555588 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.555921 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6shj8" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.555792 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.606691 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.606917 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnntr\" (UniqueName: \"kubernetes.io/projected/12457ba4-6fb6-46ff-a838-563d370583bd-kube-api-access-rnntr\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 
08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.606984 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config-secret\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.607008 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.607065 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.708763 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.708844 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnntr\" (UniqueName: \"kubernetes.io/projected/12457ba4-6fb6-46ff-a838-563d370583bd-kube-api-access-rnntr\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.708954 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config-secret\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.708995 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.710134 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.713427 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config-secret\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.715267 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.717521 4991 generic.go:334] "Generic (PLEG): container finished" podID="bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" containerID="95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a" exitCode=0 Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.717591 4991 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" event={"ID":"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7","Type":"ContainerDied","Data":"95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a"} Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.717621 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" event={"ID":"bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7","Type":"ContainerDied","Data":"431c1d1856e249ecd350ad2d0a4bfa30ce2dece9f342471c9a14615743a7ff08"} Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.717624 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-5v2tc" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.717640 4991 scope.go:117] "RemoveContainer" containerID="95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.722747 4991 generic.go:334] "Generic (PLEG): container finished" podID="5ae62f13-d5be-414e-a6f9-9b2e475afbd1" containerID="bdf0cebdfc6bfe885875c71707250ed3c4a35ce750f74c8d41fb559482de14ee" exitCode=0 Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.722784 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-52tpz" event={"ID":"5ae62f13-d5be-414e-a6f9-9b2e475afbd1","Type":"ContainerDied","Data":"bdf0cebdfc6bfe885875c71707250ed3c4a35ce750f74c8d41fb559482de14ee"} Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.741548 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnntr\" (UniqueName: \"kubernetes.io/projected/12457ba4-6fb6-46ff-a838-563d370583bd-kube-api-access-rnntr\") pod \"openstackclient\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.753165 4991 scope.go:117] "RemoveContainer" 
containerID="524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.753310 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.754026 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.767098 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.806341 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5v2tc"] Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.813629 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-5v2tc"] Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.822720 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.824035 4991 scope.go:117] "RemoveContainer" containerID="95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.824097 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: E1006 08:39:47.828169 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a\": container with ID starting with 95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a not found: ID does not exist" containerID="95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.828211 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a"} err="failed to get container status \"95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a\": rpc error: code = NotFound desc = could not find container \"95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a\": container with ID starting with 95613f94c1ee9d981e1f955cce1d6cfe5d7e3828c0be39ca25f17b8ca314f31a not found: ID does not exist" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.828240 4991 scope.go:117] "RemoveContainer" containerID="524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8" Oct 06 08:39:47 crc kubenswrapper[4991]: E1006 08:39:47.828730 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8\": container with ID starting with 524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8 not found: ID does not exist" containerID="524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.828757 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8"} 
err="failed to get container status \"524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8\": rpc error: code = NotFound desc = could not find container \"524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8\": container with ID starting with 524550fb96c879169babacbf6f1cd0e64e6e9d9caac08b3a69e95c63c429b2e8 not found: ID does not exist" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.837795 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:47 crc kubenswrapper[4991]: E1006 08:39:47.898964 4991 log.go:32] "RunPodSandbox from runtime service failed" err=< Oct 06 08:39:47 crc kubenswrapper[4991]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_12457ba4-6fb6-46ff-a838-563d370583bd_0(1a009ae4e0ed72a315d0008c375282a654d5ba3d973ddff84e64d720d2962d62): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1a009ae4e0ed72a315d0008c375282a654d5ba3d973ddff84e64d720d2962d62" Netns:"/var/run/netns/90825276-5460-4191-8c1a-b2216c1b0928" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=1a009ae4e0ed72a315d0008c375282a654d5ba3d973ddff84e64d720d2962d62;K8S_POD_UID=12457ba4-6fb6-46ff-a838-563d370583bd" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/12457ba4-6fb6-46ff-a838-563d370583bd]: expected pod UID "12457ba4-6fb6-46ff-a838-563d370583bd" but got "e8e91b06-a3c1-41dc-b2f8-af738647ade8" from Kube API Oct 06 08:39:47 crc kubenswrapper[4991]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 06 08:39:47 crc kubenswrapper[4991]: > Oct 06 08:39:47 crc kubenswrapper[4991]: E1006 08:39:47.899037 4991 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Oct 06 08:39:47 crc kubenswrapper[4991]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_12457ba4-6fb6-46ff-a838-563d370583bd_0(1a009ae4e0ed72a315d0008c375282a654d5ba3d973ddff84e64d720d2962d62): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1a009ae4e0ed72a315d0008c375282a654d5ba3d973ddff84e64d720d2962d62" Netns:"/var/run/netns/90825276-5460-4191-8c1a-b2216c1b0928" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=1a009ae4e0ed72a315d0008c375282a654d5ba3d973ddff84e64d720d2962d62;K8S_POD_UID=12457ba4-6fb6-46ff-a838-563d370583bd" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/12457ba4-6fb6-46ff-a838-563d370583bd]: expected pod UID "12457ba4-6fb6-46ff-a838-563d370583bd" but got "e8e91b06-a3c1-41dc-b2f8-af738647ade8" from Kube API Oct 06 08:39:47 crc kubenswrapper[4991]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 06 08:39:47 crc kubenswrapper[4991]: > pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.912942 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.912998 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config-secret\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.913091 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:47 crc kubenswrapper[4991]: I1006 08:39:47.913327 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzk82\" (UniqueName: \"kubernetes.io/projected/e8e91b06-a3c1-41dc-b2f8-af738647ade8-kube-api-access-zzk82\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " 
pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.015516 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.015567 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config-secret\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.015626 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.015706 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzk82\" (UniqueName: \"kubernetes.io/projected/e8e91b06-a3c1-41dc-b2f8-af738647ade8-kube-api-access-zzk82\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.017140 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.020258 4991 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.022832 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config-secret\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.032619 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzk82\" (UniqueName: \"kubernetes.io/projected/e8e91b06-a3c1-41dc-b2f8-af738647ade8-kube-api-access-zzk82\") pod \"openstackclient\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.145006 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: W1006 08:39:48.626618 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8e91b06_a3c1_41dc_b2f8_af738647ade8.slice/crio-e0f3942eca2775e8d394768a9a7631a254217c5c85c027947f1be02fccfec5a9 WatchSource:0}: Error finding container e0f3942eca2775e8d394768a9a7631a254217c5c85c027947f1be02fccfec5a9: Status 404 returned error can't find the container with id e0f3942eca2775e8d394768a9a7631a254217c5c85c027947f1be02fccfec5a9 Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.633183 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.740216 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e8e91b06-a3c1-41dc-b2f8-af738647ade8","Type":"ContainerStarted","Data":"e0f3942eca2775e8d394768a9a7631a254217c5c85c027947f1be02fccfec5a9"} Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.743406 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.749236 4991 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="12457ba4-6fb6-46ff-a838-563d370583bd" podUID="e8e91b06-a3c1-41dc-b2f8-af738647ade8" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.757987 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.830916 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config\") pod \"12457ba4-6fb6-46ff-a838-563d370583bd\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.831055 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-combined-ca-bundle\") pod \"12457ba4-6fb6-46ff-a838-563d370583bd\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.831177 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config-secret\") pod \"12457ba4-6fb6-46ff-a838-563d370583bd\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.831288 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnntr\" (UniqueName: \"kubernetes.io/projected/12457ba4-6fb6-46ff-a838-563d370583bd-kube-api-access-rnntr\") pod \"12457ba4-6fb6-46ff-a838-563d370583bd\" (UID: \"12457ba4-6fb6-46ff-a838-563d370583bd\") " Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.831964 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "12457ba4-6fb6-46ff-a838-563d370583bd" (UID: "12457ba4-6fb6-46ff-a838-563d370583bd"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.833480 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.839157 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "12457ba4-6fb6-46ff-a838-563d370583bd" (UID: "12457ba4-6fb6-46ff-a838-563d370583bd"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.841868 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12457ba4-6fb6-46ff-a838-563d370583bd-kube-api-access-rnntr" (OuterVolumeSpecName: "kube-api-access-rnntr") pod "12457ba4-6fb6-46ff-a838-563d370583bd" (UID: "12457ba4-6fb6-46ff-a838-563d370583bd"). InnerVolumeSpecName "kube-api-access-rnntr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.848398 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12457ba4-6fb6-46ff-a838-563d370583bd" (UID: "12457ba4-6fb6-46ff-a838-563d370583bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.935483 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnntr\" (UniqueName: \"kubernetes.io/projected/12457ba4-6fb6-46ff-a838-563d370583bd-kube-api-access-rnntr\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.935524 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:48 crc kubenswrapper[4991]: I1006 08:39:48.935537 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/12457ba4-6fb6-46ff-a838-563d370583bd-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.127897 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-52tpz" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.241567 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-db-sync-config-data\") pod \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.241656 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-scripts\") pod \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.241725 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-combined-ca-bundle\") pod \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.241785 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9b2r\" (UniqueName: \"kubernetes.io/projected/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-kube-api-access-r9b2r\") pod \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.241846 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-config-data\") pod \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.241913 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-etc-machine-id\") pod \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\" (UID: \"5ae62f13-d5be-414e-a6f9-9b2e475afbd1\") " Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.242247 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5ae62f13-d5be-414e-a6f9-9b2e475afbd1" (UID: "5ae62f13-d5be-414e-a6f9-9b2e475afbd1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.248398 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-scripts" (OuterVolumeSpecName: "scripts") pod "5ae62f13-d5be-414e-a6f9-9b2e475afbd1" (UID: "5ae62f13-d5be-414e-a6f9-9b2e475afbd1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.254457 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5ae62f13-d5be-414e-a6f9-9b2e475afbd1" (UID: "5ae62f13-d5be-414e-a6f9-9b2e475afbd1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.258791 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-kube-api-access-r9b2r" (OuterVolumeSpecName: "kube-api-access-r9b2r") pod "5ae62f13-d5be-414e-a6f9-9b2e475afbd1" (UID: "5ae62f13-d5be-414e-a6f9-9b2e475afbd1"). InnerVolumeSpecName "kube-api-access-r9b2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.287638 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12457ba4-6fb6-46ff-a838-563d370583bd" path="/var/lib/kubelet/pods/12457ba4-6fb6-46ff-a838-563d370583bd/volumes" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.288897 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7" path="/var/lib/kubelet/pods/bf4a2fa0-7d04-45e4-b5f1-7aa004e635a7/volumes" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.298000 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ae62f13-d5be-414e-a6f9-9b2e475afbd1" (UID: "5ae62f13-d5be-414e-a6f9-9b2e475afbd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.316095 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-config-data" (OuterVolumeSpecName: "config-data") pod "5ae62f13-d5be-414e-a6f9-9b2e475afbd1" (UID: "5ae62f13-d5be-414e-a6f9-9b2e475afbd1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.344281 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.344330 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9b2r\" (UniqueName: \"kubernetes.io/projected/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-kube-api-access-r9b2r\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.344344 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.344355 4991 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.344369 4991 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.344381 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ae62f13-d5be-414e-a6f9-9b2e475afbd1-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.756735 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.756736 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-52tpz" event={"ID":"5ae62f13-d5be-414e-a6f9-9b2e475afbd1","Type":"ContainerDied","Data":"d6f3fc17ada598288a6a54ff08d21ccacf2eb549155e9cde49458a5d06e1c105"} Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.756788 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-52tpz" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.756801 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6f3fc17ada598288a6a54ff08d21ccacf2eb549155e9cde49458a5d06e1c105" Oct 06 08:39:49 crc kubenswrapper[4991]: I1006 08:39:49.763222 4991 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="12457ba4-6fb6-46ff-a838-563d370583bd" podUID="e8e91b06-a3c1-41dc-b2f8-af738647ade8" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.036548 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:50 crc kubenswrapper[4991]: E1006 08:39:50.037053 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae62f13-d5be-414e-a6f9-9b2e475afbd1" containerName="cinder-db-sync" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.037070 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae62f13-d5be-414e-a6f9-9b2e475afbd1" containerName="cinder-db-sync" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.037357 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae62f13-d5be-414e-a6f9-9b2e475afbd1" containerName="cinder-db-sync" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.038565 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.046108 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.046397 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.046566 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tpjld" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.046733 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.062410 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-scripts\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.062453 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.062493 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/217a269d-973f-4e3f-bd2c-a057fb4c1525-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.062521 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.062560 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.062605 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxxj5\" (UniqueName: \"kubernetes.io/projected/217a269d-973f-4e3f-bd2c-a057fb4c1525-kube-api-access-bxxj5\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.067744 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.087178 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-79twg"] Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.090770 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.098119 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-79twg"] Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165285 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-config\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165398 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-scripts\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165432 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165469 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4d8\" (UniqueName: \"kubernetes.io/projected/0f68ff49-be6e-460f-91e6-ec7d260e0aff-kube-api-access-jb4d8\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165505 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/217a269d-973f-4e3f-bd2c-a057fb4c1525-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165543 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165582 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165619 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165662 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165691 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165719 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxxj5\" (UniqueName: \"kubernetes.io/projected/217a269d-973f-4e3f-bd2c-a057fb4c1525-kube-api-access-bxxj5\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.165772 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.171104 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-scripts\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.173387 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/217a269d-973f-4e3f-bd2c-a057fb4c1525-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.176090 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.191020 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.222918 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxxj5\" (UniqueName: \"kubernetes.io/projected/217a269d-973f-4e3f-bd2c-a057fb4c1525-kube-api-access-bxxj5\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.223348 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data\") pod \"cinder-scheduler-0\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.235745 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.237178 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.246218 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.261818 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.268902 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269519 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-config\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269599 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-scripts\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269624 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24014e12-e104-4da6-9df0-257c78a9a5db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269664 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269708 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb4d8\" (UniqueName: \"kubernetes.io/projected/0f68ff49-be6e-460f-91e6-ec7d260e0aff-kube-api-access-jb4d8\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269769 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269796 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24014e12-e104-4da6-9df0-257c78a9a5db-logs\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269820 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5tvh\" (UniqueName: \"kubernetes.io/projected/24014e12-e104-4da6-9df0-257c78a9a5db-kube-api-access-m5tvh\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269877 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269919 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269950 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.269980 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data-custom\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.274722 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.275068 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-config\") pod 
\"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.275537 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.275826 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.276173 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.306844 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb4d8\" (UniqueName: \"kubernetes.io/projected/0f68ff49-be6e-460f-91e6-ec7d260e0aff-kube-api-access-jb4d8\") pod \"dnsmasq-dns-6bb4fc677f-79twg\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.371654 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data-custom\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " 
pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.372028 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-scripts\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.372049 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24014e12-e104-4da6-9df0-257c78a9a5db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.372067 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.372112 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.372127 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24014e12-e104-4da6-9df0-257c78a9a5db-logs\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.372144 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5tvh\" (UniqueName: 
\"kubernetes.io/projected/24014e12-e104-4da6-9df0-257c78a9a5db-kube-api-access-m5tvh\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.372750 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24014e12-e104-4da6-9df0-257c78a9a5db-logs\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.374597 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24014e12-e104-4da6-9df0-257c78a9a5db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.378446 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-scripts\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.378697 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.379767 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.392210 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.398944 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5tvh\" (UniqueName: \"kubernetes.io/projected/24014e12-e104-4da6-9df0-257c78a9a5db-kube-api-access-m5tvh\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.402850 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data-custom\") pod \"cinder-api-0\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.411455 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.611881 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.884696 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-79twg"] Oct 06 08:39:50 crc kubenswrapper[4991]: I1006 08:39:50.991154 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:39:51 crc kubenswrapper[4991]: W1006 08:39:51.003889 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod217a269d_973f_4e3f_bd2c_a057fb4c1525.slice/crio-87f41283332689b70a0866f40fbe265e5f9aa577470ced45afe62f90c55f5eec WatchSource:0}: Error finding container 87f41283332689b70a0866f40fbe265e5f9aa577470ced45afe62f90c55f5eec: Status 404 returned error can't find the container with id 87f41283332689b70a0866f40fbe265e5f9aa577470ced45afe62f90c55f5eec Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.145660 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:51 crc kubenswrapper[4991]: W1006 08:39:51.150415 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24014e12_e104_4da6_9df0_257c78a9a5db.slice/crio-160989ca35c0c2b633ed0fd3d529fec7d869f31e81d2a3894fb368417e5cabca WatchSource:0}: Error finding container 160989ca35c0c2b633ed0fd3d529fec7d869f31e81d2a3894fb368417e5cabca: Status 404 returned error can't find the container with id 160989ca35c0c2b633ed0fd3d529fec7d869f31e81d2a3894fb368417e5cabca Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.750588 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-856f6664f9-gqcn7"] Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.752837 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.755179 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.755741 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.772924 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.781907 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-856f6664f9-gqcn7"] Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.803690 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-run-httpd\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.803771 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-public-tls-certs\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.803816 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-combined-ca-bundle\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: 
I1006 08:39:51.803876 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dkpc\" (UniqueName: \"kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-kube-api-access-2dkpc\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.804097 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-config-data\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.804168 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-internal-tls-certs\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.804189 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-etc-swift\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.804228 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-log-httpd\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 
08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.815595 4991 generic.go:334] "Generic (PLEG): container finished" podID="0f68ff49-be6e-460f-91e6-ec7d260e0aff" containerID="923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd" exitCode=0 Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.815667 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" event={"ID":"0f68ff49-be6e-460f-91e6-ec7d260e0aff","Type":"ContainerDied","Data":"923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd"} Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.815698 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" event={"ID":"0f68ff49-be6e-460f-91e6-ec7d260e0aff","Type":"ContainerStarted","Data":"21328eefffe50960b453c10904ea78ee8e4abe5a8c6152acf5097cffd82244ca"} Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.848189 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"24014e12-e104-4da6-9df0-257c78a9a5db","Type":"ContainerStarted","Data":"160989ca35c0c2b633ed0fd3d529fec7d869f31e81d2a3894fb368417e5cabca"} Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.856667 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"217a269d-973f-4e3f-bd2c-a057fb4c1525","Type":"ContainerStarted","Data":"87f41283332689b70a0866f40fbe265e5f9aa577470ced45afe62f90c55f5eec"} Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.908554 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-internal-tls-certs\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.908607 4991 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-etc-swift\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.908673 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-log-httpd\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.908730 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-run-httpd\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.908802 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-public-tls-certs\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.908861 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-combined-ca-bundle\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.908912 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dkpc\" (UniqueName: 
\"kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-kube-api-access-2dkpc\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.908984 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-config-data\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.911280 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-log-httpd\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.911597 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-run-httpd\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.917612 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-internal-tls-certs\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.929134 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-public-tls-certs\") pod 
\"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.929219 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-config-data\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.929393 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-etc-swift\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.930571 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-combined-ca-bundle\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:51 crc kubenswrapper[4991]: I1006 08:39:51.932080 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dkpc\" (UniqueName: \"kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-kube-api-access-2dkpc\") pod \"swift-proxy-856f6664f9-gqcn7\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:52 crc kubenswrapper[4991]: I1006 08:39:52.098782 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:52 crc kubenswrapper[4991]: I1006 08:39:52.274285 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:52 crc kubenswrapper[4991]: I1006 08:39:52.855257 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-856f6664f9-gqcn7"] Oct 06 08:39:52 crc kubenswrapper[4991]: W1006 08:39:52.867087 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801bcc07_7874_4eb8_8447_40178d80ea09.slice/crio-9dcfd2196f73e6739092d1903de65b86b03c7b97db9d23f6b170f3f95c13a843 WatchSource:0}: Error finding container 9dcfd2196f73e6739092d1903de65b86b03c7b97db9d23f6b170f3f95c13a843: Status 404 returned error can't find the container with id 9dcfd2196f73e6739092d1903de65b86b03c7b97db9d23f6b170f3f95c13a843 Oct 06 08:39:52 crc kubenswrapper[4991]: I1006 08:39:52.882432 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" event={"ID":"0f68ff49-be6e-460f-91e6-ec7d260e0aff","Type":"ContainerStarted","Data":"ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3"} Oct 06 08:39:52 crc kubenswrapper[4991]: I1006 08:39:52.882506 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:39:52 crc kubenswrapper[4991]: I1006 08:39:52.902867 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"24014e12-e104-4da6-9df0-257c78a9a5db","Type":"ContainerStarted","Data":"461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3"} Oct 06 08:39:52 crc kubenswrapper[4991]: I1006 08:39:52.904557 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" podStartSLOduration=2.9045424349999998 podStartE2EDuration="2.904542435s" 
podCreationTimestamp="2025-10-06 08:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:52.902775066 +0000 UTC m=+1244.640525087" watchObservedRunningTime="2025-10-06 08:39:52.904542435 +0000 UTC m=+1244.642292456" Oct 06 08:39:52 crc kubenswrapper[4991]: I1006 08:39:52.907591 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"217a269d-973f-4e3f-bd2c-a057fb4c1525","Type":"ContainerStarted","Data":"a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872"} Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.918921 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-856f6664f9-gqcn7" event={"ID":"801bcc07-7874-4eb8-8447-40178d80ea09","Type":"ContainerStarted","Data":"2b98780e70d84a8aec415e425c48e44718a23c872945a5f7884260c8ef099a6e"} Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.919231 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-856f6664f9-gqcn7" event={"ID":"801bcc07-7874-4eb8-8447-40178d80ea09","Type":"ContainerStarted","Data":"7145e5dc1f6f11d5b7c94e4ed5a3f94d613b31585eea12e4bfb621d2b89f737e"} Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.919245 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-856f6664f9-gqcn7" event={"ID":"801bcc07-7874-4eb8-8447-40178d80ea09","Type":"ContainerStarted","Data":"9dcfd2196f73e6739092d1903de65b86b03c7b97db9d23f6b170f3f95c13a843"} Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.919258 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.922240 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"217a269d-973f-4e3f-bd2c-a057fb4c1525","Type":"ContainerStarted","Data":"20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972"} Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.926253 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="24014e12-e104-4da6-9df0-257c78a9a5db" containerName="cinder-api-log" containerID="cri-o://461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3" gracePeriod=30 Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.926538 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"24014e12-e104-4da6-9df0-257c78a9a5db","Type":"ContainerStarted","Data":"75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a"} Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.926572 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="24014e12-e104-4da6-9df0-257c78a9a5db" containerName="cinder-api" containerID="cri-o://75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a" gracePeriod=30 Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.926620 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.944231 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-856f6664f9-gqcn7" podStartSLOduration=2.944210334 podStartE2EDuration="2.944210334s" podCreationTimestamp="2025-10-06 08:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:53.938249508 +0000 UTC m=+1245.675999529" watchObservedRunningTime="2025-10-06 08:39:53.944210334 +0000 UTC m=+1245.681960355" Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.962343 4991 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.049991366 podStartE2EDuration="4.962327662s" podCreationTimestamp="2025-10-06 08:39:49 +0000 UTC" firstStartedPulling="2025-10-06 08:39:51.008126486 +0000 UTC m=+1242.745876507" lastFinishedPulling="2025-10-06 08:39:51.920462782 +0000 UTC m=+1243.658212803" observedRunningTime="2025-10-06 08:39:53.961512439 +0000 UTC m=+1245.699262460" watchObservedRunningTime="2025-10-06 08:39:53.962327662 +0000 UTC m=+1245.700077673" Oct 06 08:39:53 crc kubenswrapper[4991]: I1006 08:39:53.988827 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.988805353 podStartE2EDuration="3.988805353s" podCreationTimestamp="2025-10-06 08:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:39:53.979385719 +0000 UTC m=+1245.717135740" watchObservedRunningTime="2025-10-06 08:39:53.988805353 +0000 UTC m=+1245.726555384" Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.396149 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.396784 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="ceilometer-central-agent" containerID="cri-o://1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6" gracePeriod=30 Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.396899 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="sg-core" containerID="cri-o://65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96" gracePeriod=30 Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.396909 4991 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="proxy-httpd" containerID="cri-o://178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743" gracePeriod=30 Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.396938 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="ceilometer-notification-agent" containerID="cri-o://246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d" gracePeriod=30 Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.799855 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.953789 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data-custom\") pod \"24014e12-e104-4da6-9df0-257c78a9a5db\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.954150 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24014e12-e104-4da6-9df0-257c78a9a5db-etc-machine-id\") pod \"24014e12-e104-4da6-9df0-257c78a9a5db\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.954272 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5tvh\" (UniqueName: \"kubernetes.io/projected/24014e12-e104-4da6-9df0-257c78a9a5db-kube-api-access-m5tvh\") pod \"24014e12-e104-4da6-9df0-257c78a9a5db\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.954343 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-scripts\") pod \"24014e12-e104-4da6-9df0-257c78a9a5db\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.954399 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data\") pod \"24014e12-e104-4da6-9df0-257c78a9a5db\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.954399 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24014e12-e104-4da6-9df0-257c78a9a5db-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "24014e12-e104-4da6-9df0-257c78a9a5db" (UID: "24014e12-e104-4da6-9df0-257c78a9a5db"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.954464 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-combined-ca-bundle\") pod \"24014e12-e104-4da6-9df0-257c78a9a5db\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.954482 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24014e12-e104-4da6-9df0-257c78a9a5db-logs\") pod \"24014e12-e104-4da6-9df0-257c78a9a5db\" (UID: \"24014e12-e104-4da6-9df0-257c78a9a5db\") " Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.954869 4991 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24014e12-e104-4da6-9df0-257c78a9a5db-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 
08:39:54.955236 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24014e12-e104-4da6-9df0-257c78a9a5db-logs" (OuterVolumeSpecName: "logs") pod "24014e12-e104-4da6-9df0-257c78a9a5db" (UID: "24014e12-e104-4da6-9df0-257c78a9a5db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.967596 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24014e12-e104-4da6-9df0-257c78a9a5db-kube-api-access-m5tvh" (OuterVolumeSpecName: "kube-api-access-m5tvh") pod "24014e12-e104-4da6-9df0-257c78a9a5db" (UID: "24014e12-e104-4da6-9df0-257c78a9a5db"). InnerVolumeSpecName "kube-api-access-m5tvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.969059 4991 generic.go:334] "Generic (PLEG): container finished" podID="b0071b3d-3fcd-477d-b161-1aff43447013" containerID="178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743" exitCode=0 Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.969103 4991 generic.go:334] "Generic (PLEG): container finished" podID="b0071b3d-3fcd-477d-b161-1aff43447013" containerID="65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96" exitCode=2 Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.969114 4991 generic.go:334] "Generic (PLEG): container finished" podID="b0071b3d-3fcd-477d-b161-1aff43447013" containerID="1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6" exitCode=0 Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.969159 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "24014e12-e104-4da6-9df0-257c78a9a5db" (UID: "24014e12-e104-4da6-9df0-257c78a9a5db"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.969206 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0071b3d-3fcd-477d-b161-1aff43447013","Type":"ContainerDied","Data":"178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743"} Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.969239 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0071b3d-3fcd-477d-b161-1aff43447013","Type":"ContainerDied","Data":"65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96"} Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.969255 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0071b3d-3fcd-477d-b161-1aff43447013","Type":"ContainerDied","Data":"1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6"} Oct 06 08:39:54 crc kubenswrapper[4991]: I1006 08:39:54.971319 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-scripts" (OuterVolumeSpecName: "scripts") pod "24014e12-e104-4da6-9df0-257c78a9a5db" (UID: "24014e12-e104-4da6-9df0-257c78a9a5db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:54.995604 4991 generic.go:334] "Generic (PLEG): container finished" podID="24014e12-e104-4da6-9df0-257c78a9a5db" containerID="75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a" exitCode=0 Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:54.995641 4991 generic.go:334] "Generic (PLEG): container finished" podID="24014e12-e104-4da6-9df0-257c78a9a5db" containerID="461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3" exitCode=143 Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:54.996771 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:54.997249 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"24014e12-e104-4da6-9df0-257c78a9a5db","Type":"ContainerDied","Data":"75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a"} Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:54.997274 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"24014e12-e104-4da6-9df0-257c78a9a5db","Type":"ContainerDied","Data":"461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3"} Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:54.997284 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"24014e12-e104-4da6-9df0-257c78a9a5db","Type":"ContainerDied","Data":"160989ca35c0c2b633ed0fd3d529fec7d869f31e81d2a3894fb368417e5cabca"} Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:54.997315 4991 scope.go:117] "RemoveContainer" containerID="75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:54.997956 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.057127 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5tvh\" (UniqueName: \"kubernetes.io/projected/24014e12-e104-4da6-9df0-257c78a9a5db-kube-api-access-m5tvh\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.057157 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.057168 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/24014e12-e104-4da6-9df0-257c78a9a5db-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.057179 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.097368 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24014e12-e104-4da6-9df0-257c78a9a5db" (UID: "24014e12-e104-4da6-9df0-257c78a9a5db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.099504 4991 scope.go:117] "RemoveContainer" containerID="461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.156697 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data" (OuterVolumeSpecName: "config-data") pod "24014e12-e104-4da6-9df0-257c78a9a5db" (UID: "24014e12-e104-4da6-9df0-257c78a9a5db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.160634 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.160667 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24014e12-e104-4da6-9df0-257c78a9a5db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.192967 4991 scope.go:117] "RemoveContainer" containerID="75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a" Oct 06 08:39:55 crc kubenswrapper[4991]: E1006 08:39:55.193364 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a\": container with ID starting with 75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a not found: ID does not exist" containerID="75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.193407 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a"} err="failed to get container status \"75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a\": rpc error: code = NotFound desc = could not find container \"75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a\": container with ID starting with 75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a not found: ID does not exist" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.193434 4991 scope.go:117] "RemoveContainer" 
containerID="461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3" Oct 06 08:39:55 crc kubenswrapper[4991]: E1006 08:39:55.193721 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3\": container with ID starting with 461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3 not found: ID does not exist" containerID="461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.193760 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3"} err="failed to get container status \"461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3\": rpc error: code = NotFound desc = could not find container \"461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3\": container with ID starting with 461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3 not found: ID does not exist" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.193791 4991 scope.go:117] "RemoveContainer" containerID="75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.194007 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a"} err="failed to get container status \"75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a\": rpc error: code = NotFound desc = could not find container \"75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a\": container with ID starting with 75c553063f32305bee8ee9bf21910d7e1ed0a3986633d8d4a0da9c2a138abc7a not found: ID does not exist" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.194034 4991 scope.go:117] 
"RemoveContainer" containerID="461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.194234 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3"} err="failed to get container status \"461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3\": rpc error: code = NotFound desc = could not find container \"461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3\": container with ID starting with 461d8db8a02effad38dd22a84958f38c3183127798f67ac389fbf4e649bc50f3 not found: ID does not exist" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.342188 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.355142 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.370389 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:55 crc kubenswrapper[4991]: E1006 08:39:55.370867 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24014e12-e104-4da6-9df0-257c78a9a5db" containerName="cinder-api" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.370889 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="24014e12-e104-4da6-9df0-257c78a9a5db" containerName="cinder-api" Oct 06 08:39:55 crc kubenswrapper[4991]: E1006 08:39:55.370939 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24014e12-e104-4da6-9df0-257c78a9a5db" containerName="cinder-api-log" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.370946 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="24014e12-e104-4da6-9df0-257c78a9a5db" containerName="cinder-api-log" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.371150 4991 
memory_manager.go:354] "RemoveStaleState removing state" podUID="24014e12-e104-4da6-9df0-257c78a9a5db" containerName="cinder-api-log" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.371185 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="24014e12-e104-4da6-9df0-257c78a9a5db" containerName="cinder-api" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.372628 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.376789 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.377677 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.377825 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.379964 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.389996 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.467169 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data-custom\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.467624 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-scripts\") pod \"cinder-api-0\" (UID: 
\"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.467646 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.467697 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.467752 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb6nm\" (UniqueName: \"kubernetes.io/projected/815c282e-cc40-4ff8-b3f8-155d9a91a20b-kube-api-access-zb6nm\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.467813 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.467837 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/815c282e-cc40-4ff8-b3f8-155d9a91a20b-logs\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 
08:39:55.467859 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.467893 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/815c282e-cc40-4ff8-b3f8-155d9a91a20b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.569900 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data-custom\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.569963 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-scripts\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.569985 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.570016 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.570048 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb6nm\" (UniqueName: \"kubernetes.io/projected/815c282e-cc40-4ff8-b3f8-155d9a91a20b-kube-api-access-zb6nm\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.570063 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.570079 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/815c282e-cc40-4ff8-b3f8-155d9a91a20b-logs\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.570094 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.570116 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/815c282e-cc40-4ff8-b3f8-155d9a91a20b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc 
kubenswrapper[4991]: I1006 08:39:55.570192 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/815c282e-cc40-4ff8-b3f8-155d9a91a20b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.779085 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/815c282e-cc40-4ff8-b3f8-155d9a91a20b-logs\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.779616 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.779684 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-scripts\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.780010 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.780208 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data\") pod \"cinder-api-0\" (UID: 
\"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.780501 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data-custom\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.781189 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.782817 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb6nm\" (UniqueName: \"kubernetes.io/projected/815c282e-cc40-4ff8-b3f8-155d9a91a20b-kube-api-access-zb6nm\") pod \"cinder-api-0\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " pod="openstack/cinder-api-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.816078 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.976992 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-run-httpd\") pod \"b0071b3d-3fcd-477d-b161-1aff43447013\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.977391 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-combined-ca-bundle\") pod \"b0071b3d-3fcd-477d-b161-1aff43447013\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.977441 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-scripts\") pod \"b0071b3d-3fcd-477d-b161-1aff43447013\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.977466 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-log-httpd\") pod \"b0071b3d-3fcd-477d-b161-1aff43447013\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.977527 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l68b\" (UniqueName: \"kubernetes.io/projected/b0071b3d-3fcd-477d-b161-1aff43447013-kube-api-access-2l68b\") pod \"b0071b3d-3fcd-477d-b161-1aff43447013\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.977546 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-config-data\") pod \"b0071b3d-3fcd-477d-b161-1aff43447013\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.977604 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b0071b3d-3fcd-477d-b161-1aff43447013" (UID: "b0071b3d-3fcd-477d-b161-1aff43447013"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.977640 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-sg-core-conf-yaml\") pod \"b0071b3d-3fcd-477d-b161-1aff43447013\" (UID: \"b0071b3d-3fcd-477d-b161-1aff43447013\") " Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.978070 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.978214 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b0071b3d-3fcd-477d-b161-1aff43447013" (UID: "b0071b3d-3fcd-477d-b161-1aff43447013"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.982219 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0071b3d-3fcd-477d-b161-1aff43447013-kube-api-access-2l68b" (OuterVolumeSpecName: "kube-api-access-2l68b") pod "b0071b3d-3fcd-477d-b161-1aff43447013" (UID: "b0071b3d-3fcd-477d-b161-1aff43447013"). InnerVolumeSpecName "kube-api-access-2l68b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:39:55 crc kubenswrapper[4991]: I1006 08:39:55.984408 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-scripts" (OuterVolumeSpecName: "scripts") pod "b0071b3d-3fcd-477d-b161-1aff43447013" (UID: "b0071b3d-3fcd-477d-b161-1aff43447013"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.010628 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b0071b3d-3fcd-477d-b161-1aff43447013" (UID: "b0071b3d-3fcd-477d-b161-1aff43447013"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.016219 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.021861 4991 generic.go:334] "Generic (PLEG): container finished" podID="b0071b3d-3fcd-477d-b161-1aff43447013" containerID="246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d" exitCode=0 Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.022022 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0071b3d-3fcd-477d-b161-1aff43447013","Type":"ContainerDied","Data":"246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d"} Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.022248 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0071b3d-3fcd-477d-b161-1aff43447013","Type":"ContainerDied","Data":"325e696150255f12531bc73944ff65e9d016656eb3fc3e1c4ac24eabd55f90f5"} Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.022279 4991 scope.go:117] "RemoveContainer" containerID="178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.022518 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.062052 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0071b3d-3fcd-477d-b161-1aff43447013" (UID: "b0071b3d-3fcd-477d-b161-1aff43447013"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.080582 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.080623 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0071b3d-3fcd-477d-b161-1aff43447013-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.080640 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l68b\" (UniqueName: \"kubernetes.io/projected/b0071b3d-3fcd-477d-b161-1aff43447013-kube-api-access-2l68b\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.080652 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.080670 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.093878 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-config-data" (OuterVolumeSpecName: "config-data") pod "b0071b3d-3fcd-477d-b161-1aff43447013" (UID: "b0071b3d-3fcd-477d-b161-1aff43447013"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.182715 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0071b3d-3fcd-477d-b161-1aff43447013-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.376549 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.385596 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.396097 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:56 crc kubenswrapper[4991]: E1006 08:39:56.396556 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="ceilometer-notification-agent" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.396579 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="ceilometer-notification-agent" Oct 06 08:39:56 crc kubenswrapper[4991]: E1006 08:39:56.396596 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="sg-core" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.396603 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="sg-core" Oct 06 08:39:56 crc kubenswrapper[4991]: E1006 08:39:56.396615 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="ceilometer-central-agent" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.396621 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="ceilometer-central-agent" Oct 06 08:39:56 crc 
kubenswrapper[4991]: E1006 08:39:56.396638 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="proxy-httpd" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.396644 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="proxy-httpd" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.396839 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="proxy-httpd" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.396862 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="ceilometer-central-agent" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.396871 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="ceilometer-notification-agent" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.396878 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" containerName="sg-core" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.398605 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.401833 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.402186 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.410011 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.488071 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-run-httpd\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.488242 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.488280 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqwjk\" (UniqueName: \"kubernetes.io/projected/f00c734e-514a-472e-8c2d-5adacef0d316-kube-api-access-tqwjk\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.488321 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-config-data\") pod \"ceilometer-0\" (UID: 
\"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.488376 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.488397 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-log-httpd\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.488439 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-scripts\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.590138 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-scripts\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.590273 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-run-httpd\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.590311 4991 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.590343 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqwjk\" (UniqueName: \"kubernetes.io/projected/f00c734e-514a-472e-8c2d-5adacef0d316-kube-api-access-tqwjk\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.590363 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-config-data\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.590404 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.590422 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-log-httpd\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.590790 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-log-httpd\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 
06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.595349 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-scripts\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.596353 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-run-httpd\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.598236 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.600451 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-config-data\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.612454 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.613517 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqwjk\" (UniqueName: \"kubernetes.io/projected/f00c734e-514a-472e-8c2d-5adacef0d316-kube-api-access-tqwjk\") pod \"ceilometer-0\" 
(UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " pod="openstack/ceilometer-0" Oct 06 08:39:56 crc kubenswrapper[4991]: I1006 08:39:56.725985 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:39:57 crc kubenswrapper[4991]: I1006 08:39:57.254199 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24014e12-e104-4da6-9df0-257c78a9a5db" path="/var/lib/kubelet/pods/24014e12-e104-4da6-9df0-257c78a9a5db/volumes" Oct 06 08:39:57 crc kubenswrapper[4991]: I1006 08:39:57.254951 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0071b3d-3fcd-477d-b161-1aff43447013" path="/var/lib/kubelet/pods/b0071b3d-3fcd-477d-b161-1aff43447013/volumes" Oct 06 08:39:57 crc kubenswrapper[4991]: I1006 08:39:57.528930 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:39:57 crc kubenswrapper[4991]: I1006 08:39:57.529248 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:39:58 crc kubenswrapper[4991]: I1006 08:39:58.419164 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:39:58 crc kubenswrapper[4991]: I1006 08:39:58.419532 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c526d349-85d0-4d1a-9994-4394742e3051" containerName="glance-log" 
containerID="cri-o://63fb2fb78ee5a293703b11c47bf6171b4cd47f6081d25abdd8ec70aa73cb45ba" gracePeriod=30 Oct 06 08:39:58 crc kubenswrapper[4991]: I1006 08:39:58.419600 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c526d349-85d0-4d1a-9994-4394742e3051" containerName="glance-httpd" containerID="cri-o://cd1debd69e908cac90a16b7137761847b47db6686c56a11e1822475462f4e431" gracePeriod=30 Oct 06 08:39:59 crc kubenswrapper[4991]: I1006 08:39:59.066818 4991 generic.go:334] "Generic (PLEG): container finished" podID="c526d349-85d0-4d1a-9994-4394742e3051" containerID="63fb2fb78ee5a293703b11c47bf6171b4cd47f6081d25abdd8ec70aa73cb45ba" exitCode=143 Oct 06 08:39:59 crc kubenswrapper[4991]: I1006 08:39:59.067130 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c526d349-85d0-4d1a-9994-4394742e3051","Type":"ContainerDied","Data":"63fb2fb78ee5a293703b11c47bf6171b4cd47f6081d25abdd8ec70aa73cb45ba"} Oct 06 08:39:59 crc kubenswrapper[4991]: I1006 08:39:59.781455 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.417577 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.495586 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-m72rm"] Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.496873 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-m72rm" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.510229 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-t4nb8"] Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.510470 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" podUID="dbd747a7-d54f-46ac-9bde-b887a0450f66" containerName="dnsmasq-dns" containerID="cri-o://172548e0ef4cffabbdaced9b173d5753b7ee3d6c70c09640940f3f0dea8bf7cd" gracePeriod=10 Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.535796 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-m72rm"] Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.563226 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlg22\" (UniqueName: \"kubernetes.io/projected/5d55f5bf-87aa-4993-a295-05b740129150-kube-api-access-mlg22\") pod \"nova-api-db-create-m72rm\" (UID: \"5d55f5bf-87aa-4993-a295-05b740129150\") " pod="openstack/nova-api-db-create-m72rm" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.590704 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-6pm4v"] Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.592586 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6pm4v" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.601471 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6pm4v"] Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.669594 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlg22\" (UniqueName: \"kubernetes.io/projected/5d55f5bf-87aa-4993-a295-05b740129150-kube-api-access-mlg22\") pod \"nova-api-db-create-m72rm\" (UID: \"5d55f5bf-87aa-4993-a295-05b740129150\") " pod="openstack/nova-api-db-create-m72rm" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.669736 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxts7\" (UniqueName: \"kubernetes.io/projected/2f29fae6-0696-4c14-8b68-94c800349ada-kube-api-access-vxts7\") pod \"nova-cell0-db-create-6pm4v\" (UID: \"2f29fae6-0696-4c14-8b68-94c800349ada\") " pod="openstack/nova-cell0-db-create-6pm4v" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.687879 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-z229c"] Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.690998 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-z229c" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.695353 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z229c"] Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.701488 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlg22\" (UniqueName: \"kubernetes.io/projected/5d55f5bf-87aa-4993-a295-05b740129150-kube-api-access-mlg22\") pod \"nova-api-db-create-m72rm\" (UID: \"5d55f5bf-87aa-4993-a295-05b740129150\") " pod="openstack/nova-api-db-create-m72rm" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.770881 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2pp5\" (UniqueName: \"kubernetes.io/projected/0254b022-c378-4fff-bc49-15778c28e8e0-kube-api-access-c2pp5\") pod \"nova-cell1-db-create-z229c\" (UID: \"0254b022-c378-4fff-bc49-15778c28e8e0\") " pod="openstack/nova-cell1-db-create-z229c" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.770989 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxts7\" (UniqueName: \"kubernetes.io/projected/2f29fae6-0696-4c14-8b68-94c800349ada-kube-api-access-vxts7\") pod \"nova-cell0-db-create-6pm4v\" (UID: \"2f29fae6-0696-4c14-8b68-94c800349ada\") " pod="openstack/nova-cell0-db-create-6pm4v" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.785388 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.786967 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxts7\" (UniqueName: \"kubernetes.io/projected/2f29fae6-0696-4c14-8b68-94c800349ada-kube-api-access-vxts7\") pod \"nova-cell0-db-create-6pm4v\" (UID: \"2f29fae6-0696-4c14-8b68-94c800349ada\") " pod="openstack/nova-cell0-db-create-6pm4v" 
Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.862533 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m72rm" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.873828 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2pp5\" (UniqueName: \"kubernetes.io/projected/0254b022-c378-4fff-bc49-15778c28e8e0-kube-api-access-c2pp5\") pod \"nova-cell1-db-create-z229c\" (UID: \"0254b022-c378-4fff-bc49-15778c28e8e0\") " pod="openstack/nova-cell1-db-create-z229c" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.893628 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.902104 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2pp5\" (UniqueName: \"kubernetes.io/projected/0254b022-c378-4fff-bc49-15778c28e8e0-kube-api-access-c2pp5\") pod \"nova-cell1-db-create-z229c\" (UID: \"0254b022-c378-4fff-bc49-15778c28e8e0\") " pod="openstack/nova-cell1-db-create-z229c" Oct 06 08:40:00 crc kubenswrapper[4991]: I1006 08:40:00.978614 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6pm4v" Oct 06 08:40:01 crc kubenswrapper[4991]: I1006 08:40:01.064083 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-z229c" Oct 06 08:40:01 crc kubenswrapper[4991]: I1006 08:40:01.086143 4991 generic.go:334] "Generic (PLEG): container finished" podID="dbd747a7-d54f-46ac-9bde-b887a0450f66" containerID="172548e0ef4cffabbdaced9b173d5753b7ee3d6c70c09640940f3f0dea8bf7cd" exitCode=0 Oct 06 08:40:01 crc kubenswrapper[4991]: I1006 08:40:01.086242 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" event={"ID":"dbd747a7-d54f-46ac-9bde-b887a0450f66","Type":"ContainerDied","Data":"172548e0ef4cffabbdaced9b173d5753b7ee3d6c70c09640940f3f0dea8bf7cd"} Oct 06 08:40:01 crc kubenswrapper[4991]: I1006 08:40:01.086374 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="217a269d-973f-4e3f-bd2c-a057fb4c1525" containerName="cinder-scheduler" containerID="cri-o://a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872" gracePeriod=30 Oct 06 08:40:01 crc kubenswrapper[4991]: I1006 08:40:01.086428 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="217a269d-973f-4e3f-bd2c-a057fb4c1525" containerName="probe" containerID="cri-o://20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972" gracePeriod=30 Oct 06 08:40:01 crc kubenswrapper[4991]: I1006 08:40:01.649388 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" podUID="dbd747a7-d54f-46ac-9bde-b887a0450f66" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: connect: connection refused" Oct 06 08:40:01 crc kubenswrapper[4991]: I1006 08:40:01.886984 4991 scope.go:117] "RemoveContainer" containerID="65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.044559 4991 scope.go:117] "RemoveContainer" 
containerID="246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.133143 4991 scope.go:117] "RemoveContainer" containerID="1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.133244 4991 generic.go:334] "Generic (PLEG): container finished" podID="c526d349-85d0-4d1a-9994-4394742e3051" containerID="cd1debd69e908cac90a16b7137761847b47db6686c56a11e1822475462f4e431" exitCode=0 Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.133268 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c526d349-85d0-4d1a-9994-4394742e3051","Type":"ContainerDied","Data":"cd1debd69e908cac90a16b7137761847b47db6686c56a11e1822475462f4e431"} Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.137027 4991 generic.go:334] "Generic (PLEG): container finished" podID="217a269d-973f-4e3f-bd2c-a057fb4c1525" containerID="20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972" exitCode=0 Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.137076 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"217a269d-973f-4e3f-bd2c-a057fb4c1525","Type":"ContainerDied","Data":"20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972"} Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.187664 4991 scope.go:117] "RemoveContainer" containerID="178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743" Oct 06 08:40:02 crc kubenswrapper[4991]: E1006 08:40:02.190502 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743\": container with ID starting with 178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743 not found: ID does not exist" 
containerID="178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.190553 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743"} err="failed to get container status \"178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743\": rpc error: code = NotFound desc = could not find container \"178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743\": container with ID starting with 178097c0b50dfd235ff1f86a440080dfd60c973406c2adc1fd5725ab032a8743 not found: ID does not exist" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.190583 4991 scope.go:117] "RemoveContainer" containerID="65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96" Oct 06 08:40:02 crc kubenswrapper[4991]: E1006 08:40:02.191115 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96\": container with ID starting with 65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96 not found: ID does not exist" containerID="65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.191142 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96"} err="failed to get container status \"65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96\": rpc error: code = NotFound desc = could not find container \"65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96\": container with ID starting with 65adb6de11630dc1ad217445343e307a830b1691273c09d6a3d78729ae8c5e96 not found: ID does not exist" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.191626 4991 scope.go:117] 
"RemoveContainer" containerID="246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d" Oct 06 08:40:02 crc kubenswrapper[4991]: E1006 08:40:02.192018 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d\": container with ID starting with 246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d not found: ID does not exist" containerID="246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.192063 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d"} err="failed to get container status \"246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d\": rpc error: code = NotFound desc = could not find container \"246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d\": container with ID starting with 246c43a5fe7384698fac6e7360b0c1c5c90512f8bceca5fcbbdc14fb8536699d not found: ID does not exist" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.192091 4991 scope.go:117] "RemoveContainer" containerID="1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6" Oct 06 08:40:02 crc kubenswrapper[4991]: E1006 08:40:02.192527 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6\": container with ID starting with 1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6 not found: ID does not exist" containerID="1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.192560 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6"} err="failed to get container status \"1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6\": rpc error: code = NotFound desc = could not find container \"1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6\": container with ID starting with 1ac4fb297dc8247e4883b8d485bac7d3aca1e7f2ee5f10b5423206dd0fdd31f6 not found: ID does not exist" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.194994 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.195915 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.198169 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.303848 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-config\") pod \"dbd747a7-d54f-46ac-9bde-b887a0450f66\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.303924 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmqn5\" (UniqueName: \"kubernetes.io/projected/dbd747a7-d54f-46ac-9bde-b887a0450f66-kube-api-access-rmqn5\") pod \"dbd747a7-d54f-46ac-9bde-b887a0450f66\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.303968 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-swift-storage-0\") pod 
\"dbd747a7-d54f-46ac-9bde-b887a0450f66\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.304008 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-sb\") pod \"dbd747a7-d54f-46ac-9bde-b887a0450f66\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.304036 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-svc\") pod \"dbd747a7-d54f-46ac-9bde-b887a0450f66\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.304063 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-nb\") pod \"dbd747a7-d54f-46ac-9bde-b887a0450f66\" (UID: \"dbd747a7-d54f-46ac-9bde-b887a0450f66\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.314434 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd747a7-d54f-46ac-9bde-b887a0450f66-kube-api-access-rmqn5" (OuterVolumeSpecName: "kube-api-access-rmqn5") pod "dbd747a7-d54f-46ac-9bde-b887a0450f66" (UID: "dbd747a7-d54f-46ac-9bde-b887a0450f66"). InnerVolumeSpecName "kube-api-access-rmqn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.358753 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-config" (OuterVolumeSpecName: "config") pod "dbd747a7-d54f-46ac-9bde-b887a0450f66" (UID: "dbd747a7-d54f-46ac-9bde-b887a0450f66"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.366813 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dbd747a7-d54f-46ac-9bde-b887a0450f66" (UID: "dbd747a7-d54f-46ac-9bde-b887a0450f66"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.367144 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dbd747a7-d54f-46ac-9bde-b887a0450f66" (UID: "dbd747a7-d54f-46ac-9bde-b887a0450f66"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.383348 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dbd747a7-d54f-46ac-9bde-b887a0450f66" (UID: "dbd747a7-d54f-46ac-9bde-b887a0450f66"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.384704 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dbd747a7-d54f-46ac-9bde-b887a0450f66" (UID: "dbd747a7-d54f-46ac-9bde-b887a0450f66"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.405801 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.405832 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmqn5\" (UniqueName: \"kubernetes.io/projected/dbd747a7-d54f-46ac-9bde-b887a0450f66-kube-api-access-rmqn5\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.405842 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.405850 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.405860 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.405867 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd747a7-d54f-46ac-9bde-b887a0450f66-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.585089 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.602282 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6pm4v"] Oct 06 08:40:02 crc 
kubenswrapper[4991]: I1006 08:40:02.613940 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z229c"] Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.693895 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.710013 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-scripts\") pod \"c526d349-85d0-4d1a-9994-4394742e3051\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.710056 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-config-data\") pod \"c526d349-85d0-4d1a-9994-4394742e3051\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.710112 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-internal-tls-certs\") pod \"c526d349-85d0-4d1a-9994-4394742e3051\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.710134 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km9zg\" (UniqueName: \"kubernetes.io/projected/c526d349-85d0-4d1a-9994-4394742e3051-kube-api-access-km9zg\") pod \"c526d349-85d0-4d1a-9994-4394742e3051\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.710263 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-combined-ca-bundle\") pod \"c526d349-85d0-4d1a-9994-4394742e3051\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.710287 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c526d349-85d0-4d1a-9994-4394742e3051\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.710363 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-httpd-run\") pod \"c526d349-85d0-4d1a-9994-4394742e3051\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.710416 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-logs\") pod \"c526d349-85d0-4d1a-9994-4394742e3051\" (UID: \"c526d349-85d0-4d1a-9994-4394742e3051\") " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.711059 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-logs" (OuterVolumeSpecName: "logs") pod "c526d349-85d0-4d1a-9994-4394742e3051" (UID: "c526d349-85d0-4d1a-9994-4394742e3051"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.711352 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c526d349-85d0-4d1a-9994-4394742e3051" (UID: "c526d349-85d0-4d1a-9994-4394742e3051"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.744231 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "c526d349-85d0-4d1a-9994-4394742e3051" (UID: "c526d349-85d0-4d1a-9994-4394742e3051"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.750436 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c526d349-85d0-4d1a-9994-4394742e3051-kube-api-access-km9zg" (OuterVolumeSpecName: "kube-api-access-km9zg") pod "c526d349-85d0-4d1a-9994-4394742e3051" (UID: "c526d349-85d0-4d1a-9994-4394742e3051"). InnerVolumeSpecName "kube-api-access-km9zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.760528 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-scripts" (OuterVolumeSpecName: "scripts") pod "c526d349-85d0-4d1a-9994-4394742e3051" (UID: "c526d349-85d0-4d1a-9994-4394742e3051"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.771962 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c526d349-85d0-4d1a-9994-4394742e3051" (UID: "c526d349-85d0-4d1a-9994-4394742e3051"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.816998 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.817034 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.817046 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km9zg\" (UniqueName: \"kubernetes.io/projected/c526d349-85d0-4d1a-9994-4394742e3051-kube-api-access-km9zg\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.817061 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.817086 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.817099 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c526d349-85d0-4d1a-9994-4394742e3051-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.831799 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.840882 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-internal-tls-certs" 
(OuterVolumeSpecName: "internal-tls-certs") pod "c526d349-85d0-4d1a-9994-4394742e3051" (UID: "c526d349-85d0-4d1a-9994-4394742e3051"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.842678 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-m72rm"] Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.851707 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-config-data" (OuterVolumeSpecName: "config-data") pod "c526d349-85d0-4d1a-9994-4394742e3051" (UID: "c526d349-85d0-4d1a-9994-4394742e3051"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.854511 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.918702 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.918737 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c526d349-85d0-4d1a-9994-4394742e3051-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:02 crc kubenswrapper[4991]: I1006 08:40:02.918748 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.150177 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"815c282e-cc40-4ff8-b3f8-155d9a91a20b","Type":"ContainerStarted","Data":"6007afbe8efe47aa81a20c944be444478bb0924cc26fd218c84bb8b618e51bb4"} Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.165100 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c526d349-85d0-4d1a-9994-4394742e3051","Type":"ContainerDied","Data":"9f7056fb53c728de0f27984386ce146258e84dd9e824c3816b338e72f6d109f2"} Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.165152 4991 scope.go:117] "RemoveContainer" containerID="cd1debd69e908cac90a16b7137761847b47db6686c56a11e1822475462f4e431" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.165250 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.180542 4991 generic.go:334] "Generic (PLEG): container finished" podID="2f29fae6-0696-4c14-8b68-94c800349ada" containerID="7b234e61f6c430aa76d75399799f1ef37b2b243dd9d9dd8ad0e6b5d63e0347e7" exitCode=0 Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.180632 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6pm4v" event={"ID":"2f29fae6-0696-4c14-8b68-94c800349ada","Type":"ContainerDied","Data":"7b234e61f6c430aa76d75399799f1ef37b2b243dd9d9dd8ad0e6b5d63e0347e7"} Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.180659 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6pm4v" event={"ID":"2f29fae6-0696-4c14-8b68-94c800349ada","Type":"ContainerStarted","Data":"88898a1ecb21eb8a11deb944b33f0cc14f3b743865da5b0c3a2f881368825c9f"} Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.185035 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f00c734e-514a-472e-8c2d-5adacef0d316","Type":"ContainerStarted","Data":"b3ed5cfda5719139b37f37623e4c30e7b391c027fe6721e80dd38d13dca848a5"} Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.190144 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e8e91b06-a3c1-41dc-b2f8-af738647ade8","Type":"ContainerStarted","Data":"81c13c33c57ac2a7fafdd527f3ce9a1ddf23d76bafb0a388ddb5af43c282a8b4"} Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.199358 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z229c" event={"ID":"0254b022-c378-4fff-bc49-15778c28e8e0","Type":"ContainerStarted","Data":"b418bee21a0d9a97cc7e25f384e4781bb8a1da9561b86a4b569400fcf968b49c"} Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.199396 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z229c" event={"ID":"0254b022-c378-4fff-bc49-15778c28e8e0","Type":"ContainerStarted","Data":"d4eef865d2ac3288819050f2dcf578c24854bc19b1dc27148cb6cf4cfd24dfe9"} Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.201940 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m72rm" event={"ID":"5d55f5bf-87aa-4993-a295-05b740129150","Type":"ContainerStarted","Data":"6f082bbf3e1565940b74d18d50bbd5c48ddb65bd11017a0ddbb2d9ba6eb94b64"} Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.205045 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" event={"ID":"dbd747a7-d54f-46ac-9bde-b887a0450f66","Type":"ContainerDied","Data":"53f5343e711f8d11492796264f9b06f1a654617f478363bc4604235655b56abb"} Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.205056 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-t4nb8" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.216235 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.236917 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.237850 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.979435366 podStartE2EDuration="16.237838163s" podCreationTimestamp="2025-10-06 08:39:47 +0000 UTC" firstStartedPulling="2025-10-06 08:39:48.628585946 +0000 UTC m=+1240.366335967" lastFinishedPulling="2025-10-06 08:40:01.886988743 +0000 UTC m=+1253.624738764" observedRunningTime="2025-10-06 08:40:03.212873884 +0000 UTC m=+1254.950623915" watchObservedRunningTime="2025-10-06 08:40:03.237838163 +0000 UTC m=+1254.975588184" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.265534 4991 scope.go:117] "RemoveContainer" containerID="63fb2fb78ee5a293703b11c47bf6171b4cd47f6081d25abdd8ec70aa73cb45ba" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.316094 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-z229c" podStartSLOduration=3.316068413 podStartE2EDuration="3.316068413s" podCreationTimestamp="2025-10-06 08:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:03.253083689 +0000 UTC m=+1254.990833710" watchObservedRunningTime="2025-10-06 08:40:03.316068413 +0000 UTC m=+1255.053818434" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.326323 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c526d349-85d0-4d1a-9994-4394742e3051" 
path="/var/lib/kubelet/pods/c526d349-85d0-4d1a-9994-4394742e3051/volumes" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.327039 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:40:03 crc kubenswrapper[4991]: E1006 08:40:03.327499 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd747a7-d54f-46ac-9bde-b887a0450f66" containerName="dnsmasq-dns" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.327520 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd747a7-d54f-46ac-9bde-b887a0450f66" containerName="dnsmasq-dns" Oct 06 08:40:03 crc kubenswrapper[4991]: E1006 08:40:03.327537 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c526d349-85d0-4d1a-9994-4394742e3051" containerName="glance-log" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.327543 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c526d349-85d0-4d1a-9994-4394742e3051" containerName="glance-log" Oct 06 08:40:03 crc kubenswrapper[4991]: E1006 08:40:03.327557 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd747a7-d54f-46ac-9bde-b887a0450f66" containerName="init" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.327563 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd747a7-d54f-46ac-9bde-b887a0450f66" containerName="init" Oct 06 08:40:03 crc kubenswrapper[4991]: E1006 08:40:03.327574 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c526d349-85d0-4d1a-9994-4394742e3051" containerName="glance-httpd" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.327581 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c526d349-85d0-4d1a-9994-4394742e3051" containerName="glance-httpd" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.327759 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c526d349-85d0-4d1a-9994-4394742e3051" containerName="glance-httpd" Oct 06 08:40:03 crc 
kubenswrapper[4991]: I1006 08:40:03.327769 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd747a7-d54f-46ac-9bde-b887a0450f66" containerName="dnsmasq-dns" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.327780 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c526d349-85d0-4d1a-9994-4394742e3051" containerName="glance-log" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.328927 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.329015 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.331585 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.331717 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.401801 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-t4nb8"] Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.404960 4991 scope.go:117] "RemoveContainer" containerID="172548e0ef4cffabbdaced9b173d5753b7ee3d6c70c09640940f3f0dea8bf7cd" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.413242 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-t4nb8"] Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.430053 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 
08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.430372 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.430480 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.430554 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.430707 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.430810 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" 
Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.430894 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.431029 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6dk6\" (UniqueName: \"kubernetes.io/projected/aa57b1fb-c743-4137-9501-a0110f385b1c-kube-api-access-t6dk6\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.452874 4991 scope.go:117] "RemoveContainer" containerID="ca2360137afb48d76ce897a21f84ad332bc448e563435e9f6089ee7c7ec5822d" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.532755 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.532819 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.532864 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.533447 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-logs\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.533560 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dk6\" (UniqueName: \"kubernetes.io/projected/aa57b1fb-c743-4137-9501-a0110f385b1c-kube-api-access-t6dk6\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.533609 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.533658 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.533697 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.533714 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.534402 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.536744 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.538744 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.541598 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.542262 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.547246 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.554744 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dk6\" (UniqueName: \"kubernetes.io/projected/aa57b1fb-c743-4137-9501-a0110f385b1c-kube-api-access-t6dk6\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.575654 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " pod="openstack/glance-default-internal-api-0" Oct 06 08:40:03 crc kubenswrapper[4991]: I1006 08:40:03.706951 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.219453 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f00c734e-514a-472e-8c2d-5adacef0d316","Type":"ContainerStarted","Data":"f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553"} Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.221047 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"815c282e-cc40-4ff8-b3f8-155d9a91a20b","Type":"ContainerStarted","Data":"47090eb349924642f543c04a66d3390950a21742c182060091cfbb40d99efe76"} Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.232697 4991 generic.go:334] "Generic (PLEG): container finished" podID="0254b022-c378-4fff-bc49-15778c28e8e0" containerID="b418bee21a0d9a97cc7e25f384e4781bb8a1da9561b86a4b569400fcf968b49c" exitCode=0 Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.232804 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z229c" event={"ID":"0254b022-c378-4fff-bc49-15778c28e8e0","Type":"ContainerDied","Data":"b418bee21a0d9a97cc7e25f384e4781bb8a1da9561b86a4b569400fcf968b49c"} Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.243804 4991 generic.go:334] "Generic (PLEG): container finished" podID="5d55f5bf-87aa-4993-a295-05b740129150" containerID="5495f389c5daa6ac8b781c1f5e42e30df351043da6778baf343002e37f3cee49" exitCode=0 Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.243908 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m72rm" event={"ID":"5d55f5bf-87aa-4993-a295-05b740129150","Type":"ContainerDied","Data":"5495f389c5daa6ac8b781c1f5e42e30df351043da6778baf343002e37f3cee49"} Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.376366 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:40:04 crc 
kubenswrapper[4991]: W1006 08:40:04.410258 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa57b1fb_c743_4137_9501_a0110f385b1c.slice/crio-9230720e81af75feaabd54c331218446399e1d16122ab695c8e393b09db8037b WatchSource:0}: Error finding container 9230720e81af75feaabd54c331218446399e1d16122ab695c8e393b09db8037b: Status 404 returned error can't find the container with id 9230720e81af75feaabd54c331218446399e1d16122ab695c8e393b09db8037b Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.683783 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6pm4v" Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.776318 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxts7\" (UniqueName: \"kubernetes.io/projected/2f29fae6-0696-4c14-8b68-94c800349ada-kube-api-access-vxts7\") pod \"2f29fae6-0696-4c14-8b68-94c800349ada\" (UID: \"2f29fae6-0696-4c14-8b68-94c800349ada\") " Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.783442 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f29fae6-0696-4c14-8b68-94c800349ada-kube-api-access-vxts7" (OuterVolumeSpecName: "kube-api-access-vxts7") pod "2f29fae6-0696-4c14-8b68-94c800349ada" (UID: "2f29fae6-0696-4c14-8b68-94c800349ada"). InnerVolumeSpecName "kube-api-access-vxts7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:04 crc kubenswrapper[4991]: I1006 08:40:04.878314 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxts7\" (UniqueName: \"kubernetes.io/projected/2f29fae6-0696-4c14-8b68-94c800349ada-kube-api-access-vxts7\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.217163 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.272926 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd747a7-d54f-46ac-9bde-b887a0450f66" path="/var/lib/kubelet/pods/dbd747a7-d54f-46ac-9bde-b887a0450f66/volumes" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.283395 4991 generic.go:334] "Generic (PLEG): container finished" podID="217a269d-973f-4e3f-bd2c-a057fb4c1525" containerID="a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872" exitCode=0 Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.283568 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.283819 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.283849 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"815c282e-cc40-4ff8-b3f8-155d9a91a20b","Type":"ContainerStarted","Data":"4e08aae5f1f3064fd06a75855d7641f5f9a9574da5cd200704d0371193acd2b3"} Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.283865 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"217a269d-973f-4e3f-bd2c-a057fb4c1525","Type":"ContainerDied","Data":"a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872"} Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.283885 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"217a269d-973f-4e3f-bd2c-a057fb4c1525","Type":"ContainerDied","Data":"87f41283332689b70a0866f40fbe265e5f9aa577470ced45afe62f90c55f5eec"} Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.283928 4991 scope.go:117] "RemoveContainer" containerID="20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972" Oct 06 
08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.290260 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6pm4v" event={"ID":"2f29fae6-0696-4c14-8b68-94c800349ada","Type":"ContainerDied","Data":"88898a1ecb21eb8a11deb944b33f0cc14f3b743865da5b0c3a2f881368825c9f"} Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.290288 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88898a1ecb21eb8a11deb944b33f0cc14f3b743865da5b0c3a2f881368825c9f" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.290403 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6pm4v" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.306982 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa57b1fb-c743-4137-9501-a0110f385b1c","Type":"ContainerStarted","Data":"9230720e81af75feaabd54c331218446399e1d16122ab695c8e393b09db8037b"} Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.308112 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=10.308097607 podStartE2EDuration="10.308097607s" podCreationTimestamp="2025-10-06 08:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:05.296566265 +0000 UTC m=+1257.034316286" watchObservedRunningTime="2025-10-06 08:40:05.308097607 +0000 UTC m=+1257.045847628" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.317887 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f00c734e-514a-472e-8c2d-5adacef0d316","Type":"ContainerStarted","Data":"0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da"} Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.330073 4991 scope.go:117] "RemoveContainer" 
containerID="a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.387897 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-scripts\") pod \"217a269d-973f-4e3f-bd2c-a057fb4c1525\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.388191 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data-custom\") pod \"217a269d-973f-4e3f-bd2c-a057fb4c1525\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.388320 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data\") pod \"217a269d-973f-4e3f-bd2c-a057fb4c1525\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.388396 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/217a269d-973f-4e3f-bd2c-a057fb4c1525-etc-machine-id\") pod \"217a269d-973f-4e3f-bd2c-a057fb4c1525\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.388459 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/217a269d-973f-4e3f-bd2c-a057fb4c1525-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "217a269d-973f-4e3f-bd2c-a057fb4c1525" (UID: "217a269d-973f-4e3f-bd2c-a057fb4c1525"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.388495 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxxj5\" (UniqueName: \"kubernetes.io/projected/217a269d-973f-4e3f-bd2c-a057fb4c1525-kube-api-access-bxxj5\") pod \"217a269d-973f-4e3f-bd2c-a057fb4c1525\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.388567 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-combined-ca-bundle\") pod \"217a269d-973f-4e3f-bd2c-a057fb4c1525\" (UID: \"217a269d-973f-4e3f-bd2c-a057fb4c1525\") " Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.389194 4991 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/217a269d-973f-4e3f-bd2c-a057fb4c1525-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.391231 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-scripts" (OuterVolumeSpecName: "scripts") pod "217a269d-973f-4e3f-bd2c-a057fb4c1525" (UID: "217a269d-973f-4e3f-bd2c-a057fb4c1525"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.391520 4991 scope.go:117] "RemoveContainer" containerID="20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.392677 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "217a269d-973f-4e3f-bd2c-a057fb4c1525" (UID: "217a269d-973f-4e3f-bd2c-a057fb4c1525"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:05 crc kubenswrapper[4991]: E1006 08:40:05.394856 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972\": container with ID starting with 20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972 not found: ID does not exist" containerID="20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.394920 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972"} err="failed to get container status \"20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972\": rpc error: code = NotFound desc = could not find container \"20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972\": container with ID starting with 20c1530d1c0d8ee8dcc3f757923fad11aaf4afcee6a587ea165f722d63cb9972 not found: ID does not exist" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.394948 4991 scope.go:117] "RemoveContainer" containerID="a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872" Oct 06 08:40:05 crc kubenswrapper[4991]: E1006 08:40:05.395246 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872\": container with ID starting with a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872 not found: ID does not exist" containerID="a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.395338 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872"} err="failed to get container status \"a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872\": rpc error: code = NotFound desc = could not find container \"a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872\": container with ID starting with a0c77bc9b6e6dd1adb992cc7332c817603a6c7c8431c5f4f13d26a09b63ab872 not found: ID does not exist" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.404481 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217a269d-973f-4e3f-bd2c-a057fb4c1525-kube-api-access-bxxj5" (OuterVolumeSpecName: "kube-api-access-bxxj5") pod "217a269d-973f-4e3f-bd2c-a057fb4c1525" (UID: "217a269d-973f-4e3f-bd2c-a057fb4c1525"). InnerVolumeSpecName "kube-api-access-bxxj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.449199 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "217a269d-973f-4e3f-bd2c-a057fb4c1525" (UID: "217a269d-973f-4e3f-bd2c-a057fb4c1525"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.491577 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.491613 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.491623 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.491631 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxxj5\" (UniqueName: \"kubernetes.io/projected/217a269d-973f-4e3f-bd2c-a057fb4c1525-kube-api-access-bxxj5\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.539656 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data" (OuterVolumeSpecName: "config-data") pod "217a269d-973f-4e3f-bd2c-a057fb4c1525" (UID: "217a269d-973f-4e3f-bd2c-a057fb4c1525"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.594208 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/217a269d-973f-4e3f-bd2c-a057fb4c1525-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.649359 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.685079 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.696747 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:40:05 crc kubenswrapper[4991]: E1006 08:40:05.697176 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217a269d-973f-4e3f-bd2c-a057fb4c1525" containerName="cinder-scheduler" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.697200 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="217a269d-973f-4e3f-bd2c-a057fb4c1525" containerName="cinder-scheduler" Oct 06 08:40:05 crc kubenswrapper[4991]: E1006 08:40:05.697224 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f29fae6-0696-4c14-8b68-94c800349ada" containerName="mariadb-database-create" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.697232 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f29fae6-0696-4c14-8b68-94c800349ada" containerName="mariadb-database-create" Oct 06 08:40:05 crc kubenswrapper[4991]: E1006 08:40:05.697261 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217a269d-973f-4e3f-bd2c-a057fb4c1525" containerName="probe" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.697269 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="217a269d-973f-4e3f-bd2c-a057fb4c1525" containerName="probe" Oct 06 08:40:05 crc 
kubenswrapper[4991]: I1006 08:40:05.697519 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f29fae6-0696-4c14-8b68-94c800349ada" containerName="mariadb-database-create" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.697542 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="217a269d-973f-4e3f-bd2c-a057fb4c1525" containerName="cinder-scheduler" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.697558 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="217a269d-973f-4e3f-bd2c-a057fb4c1525" containerName="probe" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.698659 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.707819 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.771737 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.840392 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-scripts\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.840630 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.842041 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dd2d34c-a29e-47b8-98b4-f75fffb11673-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.842072 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.842168 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dbrj\" (UniqueName: \"kubernetes.io/projected/4dd2d34c-a29e-47b8-98b4-f75fffb11673-kube-api-access-7dbrj\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.842253 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.976150 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-scripts\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.976177 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.976210 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dd2d34c-a29e-47b8-98b4-f75fffb11673-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.976224 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.976270 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dbrj\" (UniqueName: \"kubernetes.io/projected/4dd2d34c-a29e-47b8-98b4-f75fffb11673-kube-api-access-7dbrj\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.976321 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:05 crc kubenswrapper[4991]: I1006 08:40:05.988885 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dd2d34c-a29e-47b8-98b4-f75fffb11673-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.009717 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.011271 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.015175 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.022813 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-scripts\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.027729 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dbrj\" (UniqueName: \"kubernetes.io/projected/4dd2d34c-a29e-47b8-98b4-f75fffb11673-kube-api-access-7dbrj\") pod \"cinder-scheduler-0\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " pod="openstack/cinder-scheduler-0" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.084912 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.121820 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z229c" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.223199 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m72rm" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.280388 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2pp5\" (UniqueName: \"kubernetes.io/projected/0254b022-c378-4fff-bc49-15778c28e8e0-kube-api-access-c2pp5\") pod \"0254b022-c378-4fff-bc49-15778c28e8e0\" (UID: \"0254b022-c378-4fff-bc49-15778c28e8e0\") " Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.287583 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0254b022-c378-4fff-bc49-15778c28e8e0-kube-api-access-c2pp5" (OuterVolumeSpecName: "kube-api-access-c2pp5") pod "0254b022-c378-4fff-bc49-15778c28e8e0" (UID: "0254b022-c378-4fff-bc49-15778c28e8e0"). InnerVolumeSpecName "kube-api-access-c2pp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.339681 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z229c" event={"ID":"0254b022-c378-4fff-bc49-15778c28e8e0","Type":"ContainerDied","Data":"d4eef865d2ac3288819050f2dcf578c24854bc19b1dc27148cb6cf4cfd24dfe9"} Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.339715 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4eef865d2ac3288819050f2dcf578c24854bc19b1dc27148cb6cf4cfd24dfe9" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.339944 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-z229c" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.344183 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m72rm" event={"ID":"5d55f5bf-87aa-4993-a295-05b740129150","Type":"ContainerDied","Data":"6f082bbf3e1565940b74d18d50bbd5c48ddb65bd11017a0ddbb2d9ba6eb94b64"} Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.344217 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f082bbf3e1565940b74d18d50bbd5c48ddb65bd11017a0ddbb2d9ba6eb94b64" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.344453 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m72rm" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.348062 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa57b1fb-c743-4137-9501-a0110f385b1c","Type":"ContainerStarted","Data":"f83ef24bc48b9f3df4545258f80864d16d673fc45ac88575e0e485addba7df62"} Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.350980 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f00c734e-514a-472e-8c2d-5adacef0d316","Type":"ContainerStarted","Data":"1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123"} Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.382076 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlg22\" (UniqueName: \"kubernetes.io/projected/5d55f5bf-87aa-4993-a295-05b740129150-kube-api-access-mlg22\") pod \"5d55f5bf-87aa-4993-a295-05b740129150\" (UID: \"5d55f5bf-87aa-4993-a295-05b740129150\") " Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.382651 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2pp5\" (UniqueName: 
\"kubernetes.io/projected/0254b022-c378-4fff-bc49-15778c28e8e0-kube-api-access-c2pp5\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.389488 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d55f5bf-87aa-4993-a295-05b740129150-kube-api-access-mlg22" (OuterVolumeSpecName: "kube-api-access-mlg22") pod "5d55f5bf-87aa-4993-a295-05b740129150" (UID: "5d55f5bf-87aa-4993-a295-05b740129150"). InnerVolumeSpecName "kube-api-access-mlg22". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.484830 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlg22\" (UniqueName: \"kubernetes.io/projected/5d55f5bf-87aa-4993-a295-05b740129150-kube-api-access-mlg22\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:06 crc kubenswrapper[4991]: W1006 08:40:06.632175 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dd2d34c_a29e_47b8_98b4_f75fffb11673.slice/crio-01ff58e29358c6ee8636d1f23f1014260c33e2cf51712714c57202fc6db62ffa WatchSource:0}: Error finding container 01ff58e29358c6ee8636d1f23f1014260c33e2cf51712714c57202fc6db62ffa: Status 404 returned error can't find the container with id 01ff58e29358c6ee8636d1f23f1014260c33e2cf51712714c57202fc6db62ffa Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.636119 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:40:06 crc kubenswrapper[4991]: I1006 08:40:06.812401 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.005730 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.005954 4991 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" containerName="glance-log" containerID="cri-o://e6dbd55d04dccf656cb66469e6a3b3b8f6ada4a86993ee092d6bde6b3393ea11" gracePeriod=30 Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.006167 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" containerName="glance-httpd" containerID="cri-o://43699468a2106f54411eb7bcbafd2db8fadc0b2780109916319a4caf5809fc7e" gracePeriod=30 Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.254715 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217a269d-973f-4e3f-bd2c-a057fb4c1525" path="/var/lib/kubelet/pods/217a269d-973f-4e3f-bd2c-a057fb4c1525/volumes" Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.361926 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f00c734e-514a-472e-8c2d-5adacef0d316","Type":"ContainerStarted","Data":"d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf"} Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.362112 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="ceilometer-central-agent" containerID="cri-o://f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553" gracePeriod=30 Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.362443 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.362767 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="proxy-httpd" 
containerID="cri-o://d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf" gracePeriod=30 Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.362816 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="sg-core" containerID="cri-o://1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123" gracePeriod=30 Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.362853 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="ceilometer-notification-agent" containerID="cri-o://0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da" gracePeriod=30 Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.367094 4991 generic.go:334] "Generic (PLEG): container finished" podID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" containerID="e6dbd55d04dccf656cb66469e6a3b3b8f6ada4a86993ee092d6bde6b3393ea11" exitCode=143 Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.367149 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f64668d5-09ed-4843-a117-ed6a3ae0d2ee","Type":"ContainerDied","Data":"e6dbd55d04dccf656cb66469e6a3b3b8f6ada4a86993ee092d6bde6b3393ea11"} Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.376435 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4dd2d34c-a29e-47b8-98b4-f75fffb11673","Type":"ContainerStarted","Data":"01ff58e29358c6ee8636d1f23f1014260c33e2cf51712714c57202fc6db62ffa"} Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.381478 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa57b1fb-c743-4137-9501-a0110f385b1c","Type":"ContainerStarted","Data":"9f7dc8083673fb521af061c8df5ca04354332444376a940728267b2a54832c2d"} Oct 
06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.389402 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.595956986 podStartE2EDuration="11.38938212s" podCreationTimestamp="2025-10-06 08:39:56 +0000 UTC" firstStartedPulling="2025-10-06 08:40:02.588668573 +0000 UTC m=+1254.326418594" lastFinishedPulling="2025-10-06 08:40:06.382093707 +0000 UTC m=+1258.119843728" observedRunningTime="2025-10-06 08:40:07.385784879 +0000 UTC m=+1259.123534910" watchObservedRunningTime="2025-10-06 08:40:07.38938212 +0000 UTC m=+1259.127132141" Oct 06 08:40:07 crc kubenswrapper[4991]: I1006 08:40:07.439546 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.439526574 podStartE2EDuration="4.439526574s" podCreationTimestamp="2025-10-06 08:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:07.431130809 +0000 UTC m=+1259.168880840" watchObservedRunningTime="2025-10-06 08:40:07.439526574 +0000 UTC m=+1259.177276595" Oct 06 08:40:08 crc kubenswrapper[4991]: I1006 08:40:08.406002 4991 generic.go:334] "Generic (PLEG): container finished" podID="f00c734e-514a-472e-8c2d-5adacef0d316" containerID="d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf" exitCode=0 Oct 06 08:40:08 crc kubenswrapper[4991]: I1006 08:40:08.406252 4991 generic.go:334] "Generic (PLEG): container finished" podID="f00c734e-514a-472e-8c2d-5adacef0d316" containerID="1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123" exitCode=2 Oct 06 08:40:08 crc kubenswrapper[4991]: I1006 08:40:08.406262 4991 generic.go:334] "Generic (PLEG): container finished" podID="f00c734e-514a-472e-8c2d-5adacef0d316" containerID="0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da" exitCode=0 Oct 06 08:40:08 crc kubenswrapper[4991]: I1006 
08:40:08.406119 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f00c734e-514a-472e-8c2d-5adacef0d316","Type":"ContainerDied","Data":"d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf"} Oct 06 08:40:08 crc kubenswrapper[4991]: I1006 08:40:08.406353 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f00c734e-514a-472e-8c2d-5adacef0d316","Type":"ContainerDied","Data":"1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123"} Oct 06 08:40:08 crc kubenswrapper[4991]: I1006 08:40:08.406372 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f00c734e-514a-472e-8c2d-5adacef0d316","Type":"ContainerDied","Data":"0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da"} Oct 06 08:40:08 crc kubenswrapper[4991]: I1006 08:40:08.411456 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4dd2d34c-a29e-47b8-98b4-f75fffb11673","Type":"ContainerStarted","Data":"d649d548626a4bd3bff872429af0bef8f3a02f2808a38286a2013d34229a5407"} Oct 06 08:40:08 crc kubenswrapper[4991]: I1006 08:40:08.411499 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4dd2d34c-a29e-47b8-98b4-f75fffb11673","Type":"ContainerStarted","Data":"0d1610527cf8b6f50326a4d6ebe66a1e52c2dc1024a98e810b007cc8199eb0b7"} Oct 06 08:40:08 crc kubenswrapper[4991]: I1006 08:40:08.447123 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.4471068750000002 podStartE2EDuration="3.447106875s" podCreationTimestamp="2025-10-06 08:40:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:08.443660368 +0000 UTC m=+1260.181410379" watchObservedRunningTime="2025-10-06 08:40:08.447106875 +0000 UTC 
m=+1260.184856896" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.019812 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.147246 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-run-httpd\") pod \"f00c734e-514a-472e-8c2d-5adacef0d316\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.147393 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-log-httpd\") pod \"f00c734e-514a-472e-8c2d-5adacef0d316\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.147425 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-combined-ca-bundle\") pod \"f00c734e-514a-472e-8c2d-5adacef0d316\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.147482 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-scripts\") pod \"f00c734e-514a-472e-8c2d-5adacef0d316\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.147704 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f00c734e-514a-472e-8c2d-5adacef0d316" (UID: "f00c734e-514a-472e-8c2d-5adacef0d316"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.147901 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-sg-core-conf-yaml\") pod \"f00c734e-514a-472e-8c2d-5adacef0d316\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.147975 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f00c734e-514a-472e-8c2d-5adacef0d316" (UID: "f00c734e-514a-472e-8c2d-5adacef0d316"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.148363 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-config-data\") pod \"f00c734e-514a-472e-8c2d-5adacef0d316\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.148524 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqwjk\" (UniqueName: \"kubernetes.io/projected/f00c734e-514a-472e-8c2d-5adacef0d316-kube-api-access-tqwjk\") pod \"f00c734e-514a-472e-8c2d-5adacef0d316\" (UID: \"f00c734e-514a-472e-8c2d-5adacef0d316\") " Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.149902 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.149994 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f00c734e-514a-472e-8c2d-5adacef0d316-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.167463 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-scripts" (OuterVolumeSpecName: "scripts") pod "f00c734e-514a-472e-8c2d-5adacef0d316" (UID: "f00c734e-514a-472e-8c2d-5adacef0d316"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.170552 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f00c734e-514a-472e-8c2d-5adacef0d316-kube-api-access-tqwjk" (OuterVolumeSpecName: "kube-api-access-tqwjk") pod "f00c734e-514a-472e-8c2d-5adacef0d316" (UID: "f00c734e-514a-472e-8c2d-5adacef0d316"). InnerVolumeSpecName "kube-api-access-tqwjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.199073 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f00c734e-514a-472e-8c2d-5adacef0d316" (UID: "f00c734e-514a-472e-8c2d-5adacef0d316"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.251372 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f00c734e-514a-472e-8c2d-5adacef0d316" (UID: "f00c734e-514a-472e-8c2d-5adacef0d316"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.252071 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.252095 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.252105 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqwjk\" (UniqueName: \"kubernetes.io/projected/f00c734e-514a-472e-8c2d-5adacef0d316-kube-api-access-tqwjk\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.252114 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.282258 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-config-data" (OuterVolumeSpecName: "config-data") pod "f00c734e-514a-472e-8c2d-5adacef0d316" (UID: "f00c734e-514a-472e-8c2d-5adacef0d316"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.354592 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f00c734e-514a-472e-8c2d-5adacef0d316-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.392773 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7988dccf5c-j9ll7" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.447739 4991 generic.go:334] "Generic (PLEG): container finished" podID="f00c734e-514a-472e-8c2d-5adacef0d316" containerID="f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553" exitCode=0 Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.448374 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.448428 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f00c734e-514a-472e-8c2d-5adacef0d316","Type":"ContainerDied","Data":"f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553"} Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.448494 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f00c734e-514a-472e-8c2d-5adacef0d316","Type":"ContainerDied","Data":"b3ed5cfda5719139b37f37623e4c30e7b391c027fe6721e80dd38d13dca848a5"} Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.448518 4991 scope.go:117] "RemoveContainer" containerID="d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.489167 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f9885cd76-4cxdt"] Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.489440 4991 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-6f9885cd76-4cxdt" podUID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" containerName="neutron-api" containerID="cri-o://361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa" gracePeriod=30 Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.489476 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f9885cd76-4cxdt" podUID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" containerName="neutron-httpd" containerID="cri-o://46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323" gracePeriod=30 Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.523856 4991 scope.go:117] "RemoveContainer" containerID="1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.555194 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.578434 4991 scope.go:117] "RemoveContainer" containerID="0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.578665 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.589473 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:09 crc kubenswrapper[4991]: E1006 08:40:09.590220 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d55f5bf-87aa-4993-a295-05b740129150" containerName="mariadb-database-create" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.590354 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d55f5bf-87aa-4993-a295-05b740129150" containerName="mariadb-database-create" Oct 06 08:40:09 crc kubenswrapper[4991]: E1006 08:40:09.590434 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="sg-core" Oct 06 08:40:09 
crc kubenswrapper[4991]: I1006 08:40:09.590546 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="sg-core" Oct 06 08:40:09 crc kubenswrapper[4991]: E1006 08:40:09.590634 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="ceilometer-central-agent" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.590702 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="ceilometer-central-agent" Oct 06 08:40:09 crc kubenswrapper[4991]: E1006 08:40:09.590783 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="ceilometer-notification-agent" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.590850 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="ceilometer-notification-agent" Oct 06 08:40:09 crc kubenswrapper[4991]: E1006 08:40:09.590925 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="proxy-httpd" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.590987 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="proxy-httpd" Oct 06 08:40:09 crc kubenswrapper[4991]: E1006 08:40:09.591054 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0254b022-c378-4fff-bc49-15778c28e8e0" containerName="mariadb-database-create" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.591132 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0254b022-c378-4fff-bc49-15778c28e8e0" containerName="mariadb-database-create" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.591432 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" 
containerName="proxy-httpd" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.591551 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="ceilometer-notification-agent" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.591637 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="sg-core" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.591713 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0254b022-c378-4fff-bc49-15778c28e8e0" containerName="mariadb-database-create" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.591776 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" containerName="ceilometer-central-agent" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.591846 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d55f5bf-87aa-4993-a295-05b740129150" containerName="mariadb-database-create" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.608044 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.608238 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.612895 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.613061 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.613218 4991 scope.go:117] "RemoveContainer" containerID="f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.641080 4991 scope.go:117] "RemoveContainer" containerID="d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf" Oct 06 08:40:09 crc kubenswrapper[4991]: E1006 08:40:09.641508 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf\": container with ID starting with d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf not found: ID does not exist" containerID="d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.641606 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf"} err="failed to get container status \"d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf\": rpc error: code = NotFound desc = could not find container \"d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf\": container with ID starting with d46bf2ccb1b77e25f51613502d52e2a7a58891df1cc6d445bc8c0ca1037b43bf not found: ID does not exist" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.641727 4991 scope.go:117] "RemoveContainer" containerID="1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123" Oct 06 08:40:09 crc 
kubenswrapper[4991]: E1006 08:40:09.642126 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123\": container with ID starting with 1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123 not found: ID does not exist" containerID="1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.642228 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123"} err="failed to get container status \"1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123\": rpc error: code = NotFound desc = could not find container \"1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123\": container with ID starting with 1f37c547a7f40a8610e746756993042b2d14c3bdc3df0be7ab35a4993b14a123 not found: ID does not exist" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.642362 4991 scope.go:117] "RemoveContainer" containerID="0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da" Oct 06 08:40:09 crc kubenswrapper[4991]: E1006 08:40:09.642709 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da\": container with ID starting with 0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da not found: ID does not exist" containerID="0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.642814 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da"} err="failed to get container status 
\"0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da\": rpc error: code = NotFound desc = could not find container \"0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da\": container with ID starting with 0926fa39714dcf10891e88318a2ab9765bec6100963f9243df974f67934ae1da not found: ID does not exist" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.642902 4991 scope.go:117] "RemoveContainer" containerID="f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553" Oct 06 08:40:09 crc kubenswrapper[4991]: E1006 08:40:09.643195 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553\": container with ID starting with f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553 not found: ID does not exist" containerID="f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.643280 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553"} err="failed to get container status \"f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553\": rpc error: code = NotFound desc = could not find container \"f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553\": container with ID starting with f251bd2f26aa51293a5927e325a1acd65bf3b1f7225814d10c6efc489d060553 not found: ID does not exist" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.778287 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 
08:40:09.778735 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-config-data\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.778794 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-run-httpd\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.778831 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdb7k\" (UniqueName: \"kubernetes.io/projected/62e0237c-d25f-40ce-9752-5f9605d61912-kube-api-access-zdb7k\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.778881 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-scripts\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.778927 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.778946 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-log-httpd\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.879925 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdb7k\" (UniqueName: \"kubernetes.io/projected/62e0237c-d25f-40ce-9752-5f9605d61912-kube-api-access-zdb7k\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.880340 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-scripts\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.880749 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.881402 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-log-httpd\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.881577 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc 
kubenswrapper[4991]: I1006 08:40:09.881674 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-config-data\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.881865 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-log-httpd\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.882443 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-run-httpd\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.882603 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-run-httpd\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.887619 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-config-data\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.889878 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-scripts\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " 
pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.890186 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.896030 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.900026 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdb7k\" (UniqueName: \"kubernetes.io/projected/62e0237c-d25f-40ce-9752-5f9605d61912-kube-api-access-zdb7k\") pod \"ceilometer-0\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " pod="openstack/ceilometer-0" Oct 06 08:40:09 crc kubenswrapper[4991]: I1006 08:40:09.940663 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.469111 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.476998 4991 generic.go:334] "Generic (PLEG): container finished" podID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" containerID="43699468a2106f54411eb7bcbafd2db8fadc0b2780109916319a4caf5809fc7e" exitCode=0 Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.477116 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f64668d5-09ed-4843-a117-ed6a3ae0d2ee","Type":"ContainerDied","Data":"43699468a2106f54411eb7bcbafd2db8fadc0b2780109916319a4caf5809fc7e"} Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.483423 4991 generic.go:334] "Generic (PLEG): container finished" podID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" containerID="46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323" exitCode=0 Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.483498 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9885cd76-4cxdt" event={"ID":"e9816fde-c4d0-4c01-8d09-2af0f4256fd1","Type":"ContainerDied","Data":"46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323"} Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.644413 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9279-account-create-qgf7w"] Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.647877 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9279-account-create-qgf7w" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.651494 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.667628 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9279-account-create-qgf7w"] Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.687803 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.800991 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.801318 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-config-data\") pod \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.801397 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcflh\" (UniqueName: \"kubernetes.io/projected/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-kube-api-access-bcflh\") pod \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.801457 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-logs\") pod \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " Oct 06 08:40:10 crc 
kubenswrapper[4991]: I1006 08:40:10.801478 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-httpd-run\") pod \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.801509 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-scripts\") pod \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.801564 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-combined-ca-bundle\") pod \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.801829 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-public-tls-certs\") pod \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\" (UID: \"f64668d5-09ed-4843-a117-ed6a3ae0d2ee\") " Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.802022 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f64668d5-09ed-4843-a117-ed6a3ae0d2ee" (UID: "f64668d5-09ed-4843-a117-ed6a3ae0d2ee"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.802116 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk7wn\" (UniqueName: \"kubernetes.io/projected/735f5180-6fe2-4632-b321-f6c96f3c9400-kube-api-access-gk7wn\") pod \"nova-api-9279-account-create-qgf7w\" (UID: \"735f5180-6fe2-4632-b321-f6c96f3c9400\") " pod="openstack/nova-api-9279-account-create-qgf7w" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.802249 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.805454 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-logs" (OuterVolumeSpecName: "logs") pod "f64668d5-09ed-4843-a117-ed6a3ae0d2ee" (UID: "f64668d5-09ed-4843-a117-ed6a3ae0d2ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.808393 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-kube-api-access-bcflh" (OuterVolumeSpecName: "kube-api-access-bcflh") pod "f64668d5-09ed-4843-a117-ed6a3ae0d2ee" (UID: "f64668d5-09ed-4843-a117-ed6a3ae0d2ee"). InnerVolumeSpecName "kube-api-access-bcflh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.810430 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f64668d5-09ed-4843-a117-ed6a3ae0d2ee" (UID: "f64668d5-09ed-4843-a117-ed6a3ae0d2ee"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.813413 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-scripts" (OuterVolumeSpecName: "scripts") pod "f64668d5-09ed-4843-a117-ed6a3ae0d2ee" (UID: "f64668d5-09ed-4843-a117-ed6a3ae0d2ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.835083 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f64668d5-09ed-4843-a117-ed6a3ae0d2ee" (UID: "f64668d5-09ed-4843-a117-ed6a3ae0d2ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.839357 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a45f-account-create-n424l"] Oct 06 08:40:10 crc kubenswrapper[4991]: E1006 08:40:10.839924 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" containerName="glance-httpd" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.839946 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" containerName="glance-httpd" Oct 06 08:40:10 crc kubenswrapper[4991]: E1006 08:40:10.840002 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" containerName="glance-log" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.840011 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" containerName="glance-log" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.840225 4991 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" containerName="glance-log" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.840253 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" containerName="glance-httpd" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.841006 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a45f-account-create-n424l" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.844396 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.861655 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a45f-account-create-n424l"] Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.883477 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f64668d5-09ed-4843-a117-ed6a3ae0d2ee" (UID: "f64668d5-09ed-4843-a117-ed6a3ae0d2ee"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.903376 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk7wn\" (UniqueName: \"kubernetes.io/projected/735f5180-6fe2-4632-b321-f6c96f3c9400-kube-api-access-gk7wn\") pod \"nova-api-9279-account-create-qgf7w\" (UID: \"735f5180-6fe2-4632-b321-f6c96f3c9400\") " pod="openstack/nova-api-9279-account-create-qgf7w" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.903586 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.903600 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.903625 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.903640 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcflh\" (UniqueName: \"kubernetes.io/projected/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-kube-api-access-bcflh\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.903657 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.903669 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.910469 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-config-data" (OuterVolumeSpecName: "config-data") pod "f64668d5-09ed-4843-a117-ed6a3ae0d2ee" (UID: "f64668d5-09ed-4843-a117-ed6a3ae0d2ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.925213 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk7wn\" (UniqueName: \"kubernetes.io/projected/735f5180-6fe2-4632-b321-f6c96f3c9400-kube-api-access-gk7wn\") pod \"nova-api-9279-account-create-qgf7w\" (UID: \"735f5180-6fe2-4632-b321-f6c96f3c9400\") " pod="openstack/nova-api-9279-account-create-qgf7w" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.929570 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 06 08:40:10 crc kubenswrapper[4991]: I1006 08:40:10.984257 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9279-account-create-qgf7w" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.005714 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqdzj\" (UniqueName: \"kubernetes.io/projected/c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b-kube-api-access-mqdzj\") pod \"nova-cell0-a45f-account-create-n424l\" (UID: \"c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b\") " pod="openstack/nova-cell0-a45f-account-create-n424l" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.005908 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.005924 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f64668d5-09ed-4843-a117-ed6a3ae0d2ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.089413 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.107354 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqdzj\" (UniqueName: \"kubernetes.io/projected/c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b-kube-api-access-mqdzj\") pod \"nova-cell0-a45f-account-create-n424l\" (UID: \"c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b\") " pod="openstack/nova-cell0-a45f-account-create-n424l" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.125751 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqdzj\" (UniqueName: \"kubernetes.io/projected/c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b-kube-api-access-mqdzj\") pod \"nova-cell0-a45f-account-create-n424l\" (UID: \"c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b\") " 
pod="openstack/nova-cell0-a45f-account-create-n424l" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.162654 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a45f-account-create-n424l" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.256712 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f00c734e-514a-472e-8c2d-5adacef0d316" path="/var/lib/kubelet/pods/f00c734e-514a-472e-8c2d-5adacef0d316/volumes" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.503387 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f64668d5-09ed-4843-a117-ed6a3ae0d2ee","Type":"ContainerDied","Data":"a2a6e712ac43e35d206bfa4081069a2b3747825540f3c6c13a96fca1c73ff32e"} Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.503713 4991 scope.go:117] "RemoveContainer" containerID="43699468a2106f54411eb7bcbafd2db8fadc0b2780109916319a4caf5809fc7e" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.503850 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.507022 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9279-account-create-qgf7w"] Oct 06 08:40:11 crc kubenswrapper[4991]: W1006 08:40:11.510280 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod735f5180_6fe2_4632_b321_f6c96f3c9400.slice/crio-04e367b997ec9a99e3a1f035d3f96969fe39251183dce83bc6fcbb01f6d1c03e WatchSource:0}: Error finding container 04e367b997ec9a99e3a1f035d3f96969fe39251183dce83bc6fcbb01f6d1c03e: Status 404 returned error can't find the container with id 04e367b997ec9a99e3a1f035d3f96969fe39251183dce83bc6fcbb01f6d1c03e Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.510540 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62e0237c-d25f-40ce-9752-5f9605d61912","Type":"ContainerStarted","Data":"d31707e60eb3ba97e7046bb66b77143cdb7b5c0f21ecb38d35bca6a020f6dacc"} Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.510576 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62e0237c-d25f-40ce-9752-5f9605d61912","Type":"ContainerStarted","Data":"a5f5b2fdc64c0fbabbac848bd69b5d8b191294c2f3d031a18433a900bc6915d9"} Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.534353 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.544828 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.568528 4991 scope.go:117] "RemoveContainer" containerID="e6dbd55d04dccf656cb66469e6a3b3b8f6ada4a86993ee092d6bde6b3393ea11" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.599352 4991 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.601315 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.605714 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.605884 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.650354 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.723259 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.723375 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-logs\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.723399 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjp9s\" (UniqueName: \"kubernetes.io/projected/d1a24973-6ef6-4732-9a96-040ce646a707-kube-api-access-tjp9s\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 
06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.723416 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.723445 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.723515 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.723535 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.723561 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 
08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.779566 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a45f-account-create-n424l"] Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.825135 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-logs\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.825196 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjp9s\" (UniqueName: \"kubernetes.io/projected/d1a24973-6ef6-4732-9a96-040ce646a707-kube-api-access-tjp9s\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.825217 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.825259 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.825352 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.825387 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.825415 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.825445 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.834273 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.838148 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" 
Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.838652 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-logs\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.839400 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.839406 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.839801 4991 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.851726 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.864610 4991 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tjp9s\" (UniqueName: \"kubernetes.io/projected/d1a24973-6ef6-4732-9a96-040ce646a707-kube-api-access-tjp9s\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.879326 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " pod="openstack/glance-default-external-api-0" Oct 06 08:40:11 crc kubenswrapper[4991]: I1006 08:40:11.951005 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:40:12 crc kubenswrapper[4991]: I1006 08:40:12.524162 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a45f-account-create-n424l" event={"ID":"c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b","Type":"ContainerStarted","Data":"448d92f50c90e335047076f51466baf5a63aec01c30cb258c80a14dc8b42453a"} Oct 06 08:40:12 crc kubenswrapper[4991]: I1006 08:40:12.524976 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a45f-account-create-n424l" event={"ID":"c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b","Type":"ContainerStarted","Data":"3162e5f0bed405a1fdb3dc5098e9dcd5fcb79431c3541faf1dc058d033c0fd6c"} Oct 06 08:40:12 crc kubenswrapper[4991]: I1006 08:40:12.533065 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62e0237c-d25f-40ce-9752-5f9605d61912","Type":"ContainerStarted","Data":"f88af68b518b78719f896c4c87f456eb4221665cf4334316f3fba55ada1c6f0e"} Oct 06 08:40:12 crc kubenswrapper[4991]: I1006 08:40:12.542856 4991 generic.go:334] "Generic (PLEG): container finished" podID="735f5180-6fe2-4632-b321-f6c96f3c9400" 
containerID="4adcd03b16369123fe98ce7c851a353189075e3043ebaad574d8063e0846582d" exitCode=0 Oct 06 08:40:12 crc kubenswrapper[4991]: I1006 08:40:12.542956 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9279-account-create-qgf7w" event={"ID":"735f5180-6fe2-4632-b321-f6c96f3c9400","Type":"ContainerDied","Data":"4adcd03b16369123fe98ce7c851a353189075e3043ebaad574d8063e0846582d"} Oct 06 08:40:12 crc kubenswrapper[4991]: I1006 08:40:12.542982 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9279-account-create-qgf7w" event={"ID":"735f5180-6fe2-4632-b321-f6c96f3c9400","Type":"ContainerStarted","Data":"04e367b997ec9a99e3a1f035d3f96969fe39251183dce83bc6fcbb01f6d1c03e"} Oct 06 08:40:12 crc kubenswrapper[4991]: I1006 08:40:12.558572 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-a45f-account-create-n424l" podStartSLOduration=2.55855161 podStartE2EDuration="2.55855161s" podCreationTimestamp="2025-10-06 08:40:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:12.552594864 +0000 UTC m=+1264.290344885" watchObservedRunningTime="2025-10-06 08:40:12.55855161 +0000 UTC m=+1264.296301631" Oct 06 08:40:12 crc kubenswrapper[4991]: I1006 08:40:12.675845 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:40:12 crc kubenswrapper[4991]: W1006 08:40:12.676451 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a24973_6ef6_4732_9a96_040ce646a707.slice/crio-7df79192d6f34a90ebfd5f7c39a7e4dc001ad60b7b016b0f55c1a72ada7c789b WatchSource:0}: Error finding container 7df79192d6f34a90ebfd5f7c39a7e4dc001ad60b7b016b0f55c1a72ada7c789b: Status 404 returned error can't find the container with id 
7df79192d6f34a90ebfd5f7c39a7e4dc001ad60b7b016b0f55c1a72ada7c789b Oct 06 08:40:13 crc kubenswrapper[4991]: I1006 08:40:13.264055 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64668d5-09ed-4843-a117-ed6a3ae0d2ee" path="/var/lib/kubelet/pods/f64668d5-09ed-4843-a117-ed6a3ae0d2ee/volumes" Oct 06 08:40:13 crc kubenswrapper[4991]: I1006 08:40:13.561182 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d1a24973-6ef6-4732-9a96-040ce646a707","Type":"ContainerStarted","Data":"2f29341e126502f19b2fe665eb6f63634e44f634ecd075749718883b8f004d5b"} Oct 06 08:40:13 crc kubenswrapper[4991]: I1006 08:40:13.562444 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d1a24973-6ef6-4732-9a96-040ce646a707","Type":"ContainerStarted","Data":"7df79192d6f34a90ebfd5f7c39a7e4dc001ad60b7b016b0f55c1a72ada7c789b"} Oct 06 08:40:13 crc kubenswrapper[4991]: I1006 08:40:13.566510 4991 generic.go:334] "Generic (PLEG): container finished" podID="c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b" containerID="448d92f50c90e335047076f51466baf5a63aec01c30cb258c80a14dc8b42453a" exitCode=0 Oct 06 08:40:13 crc kubenswrapper[4991]: I1006 08:40:13.566617 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a45f-account-create-n424l" event={"ID":"c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b","Type":"ContainerDied","Data":"448d92f50c90e335047076f51466baf5a63aec01c30cb258c80a14dc8b42453a"} Oct 06 08:40:13 crc kubenswrapper[4991]: I1006 08:40:13.579523 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62e0237c-d25f-40ce-9752-5f9605d61912","Type":"ContainerStarted","Data":"a593008fef4e009d1f98e807ec5f54e4172767d3892b84efb558b98bbcfbc94d"} Oct 06 08:40:13 crc kubenswrapper[4991]: I1006 08:40:13.598504 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:13 crc 
kubenswrapper[4991]: I1006 08:40:13.712595 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:13 crc kubenswrapper[4991]: I1006 08:40:13.714839 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:13 crc kubenswrapper[4991]: I1006 08:40:13.771648 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:13 crc kubenswrapper[4991]: I1006 08:40:13.772962 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.061848 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9279-account-create-qgf7w" Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.176304 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk7wn\" (UniqueName: \"kubernetes.io/projected/735f5180-6fe2-4632-b321-f6c96f3c9400-kube-api-access-gk7wn\") pod \"735f5180-6fe2-4632-b321-f6c96f3c9400\" (UID: \"735f5180-6fe2-4632-b321-f6c96f3c9400\") " Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.185568 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/735f5180-6fe2-4632-b321-f6c96f3c9400-kube-api-access-gk7wn" (OuterVolumeSpecName: "kube-api-access-gk7wn") pod "735f5180-6fe2-4632-b321-f6c96f3c9400" (UID: "735f5180-6fe2-4632-b321-f6c96f3c9400"). InnerVolumeSpecName "kube-api-access-gk7wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.191675 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.283780 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk7wn\" (UniqueName: \"kubernetes.io/projected/735f5180-6fe2-4632-b321-f6c96f3c9400-kube-api-access-gk7wn\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.595364 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9279-account-create-qgf7w" event={"ID":"735f5180-6fe2-4632-b321-f6c96f3c9400","Type":"ContainerDied","Data":"04e367b997ec9a99e3a1f035d3f96969fe39251183dce83bc6fcbb01f6d1c03e"} Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.595416 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04e367b997ec9a99e3a1f035d3f96969fe39251183dce83bc6fcbb01f6d1c03e" Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.595491 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9279-account-create-qgf7w" Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.599836 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d1a24973-6ef6-4732-9a96-040ce646a707","Type":"ContainerStarted","Data":"4fbfc2abb485c8ccd9560493a1360ee31985544c8877ac7b1baa4f76139308c7"} Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.600975 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.601014 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:14 crc kubenswrapper[4991]: I1006 08:40:14.634604 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.6345837359999997 podStartE2EDuration="3.634583736s" podCreationTimestamp="2025-10-06 08:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:14.627472477 +0000 UTC m=+1266.365222538" watchObservedRunningTime="2025-10-06 08:40:14.634583736 +0000 UTC m=+1266.372333757" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.147587 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a45f-account-create-n424l" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.200191 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqdzj\" (UniqueName: \"kubernetes.io/projected/c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b-kube-api-access-mqdzj\") pod \"c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b\" (UID: \"c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b\") " Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.225566 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b-kube-api-access-mqdzj" (OuterVolumeSpecName: "kube-api-access-mqdzj") pod "c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b" (UID: "c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b"). InnerVolumeSpecName "kube-api-access-mqdzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.302076 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqdzj\" (UniqueName: \"kubernetes.io/projected/c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b-kube-api-access-mqdzj\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.546417 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.615461 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctrl9\" (UniqueName: \"kubernetes.io/projected/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-kube-api-access-ctrl9\") pod \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.615736 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-config\") pod \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.615770 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-combined-ca-bundle\") pod \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.615807 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-ovndb-tls-certs\") pod \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.615865 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-httpd-config\") pod \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\" (UID: \"e9816fde-c4d0-4c01-8d09-2af0f4256fd1\") " Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.623666 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e9816fde-c4d0-4c01-8d09-2af0f4256fd1" (UID: "e9816fde-c4d0-4c01-8d09-2af0f4256fd1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.642730 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a45f-account-create-n424l" event={"ID":"c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b","Type":"ContainerDied","Data":"3162e5f0bed405a1fdb3dc5098e9dcd5fcb79431c3541faf1dc058d033c0fd6c"} Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.642773 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3162e5f0bed405a1fdb3dc5098e9dcd5fcb79431c3541faf1dc058d033c0fd6c" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.643157 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a45f-account-create-n424l" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.654985 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="ceilometer-central-agent" containerID="cri-o://d31707e60eb3ba97e7046bb66b77143cdb7b5c0f21ecb38d35bca6a020f6dacc" gracePeriod=30 Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.655274 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62e0237c-d25f-40ce-9752-5f9605d61912","Type":"ContainerStarted","Data":"07c82f86f5435e9ed2843b0557d338c2ca421a602305b8fc5422a8355702925f"} Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.655346 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.655593 4991 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="proxy-httpd" containerID="cri-o://07c82f86f5435e9ed2843b0557d338c2ca421a602305b8fc5422a8355702925f" gracePeriod=30 Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.655649 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="ceilometer-notification-agent" containerID="cri-o://f88af68b518b78719f896c4c87f456eb4221665cf4334316f3fba55ada1c6f0e" gracePeriod=30 Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.655721 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="sg-core" containerID="cri-o://a593008fef4e009d1f98e807ec5f54e4172767d3892b84efb558b98bbcfbc94d" gracePeriod=30 Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.664592 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-kube-api-access-ctrl9" (OuterVolumeSpecName: "kube-api-access-ctrl9") pod "e9816fde-c4d0-4c01-8d09-2af0f4256fd1" (UID: "e9816fde-c4d0-4c01-8d09-2af0f4256fd1"). InnerVolumeSpecName "kube-api-access-ctrl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.690351 4991 generic.go:334] "Generic (PLEG): container finished" podID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" containerID="361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa" exitCode=0 Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.691367 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9885cd76-4cxdt" event={"ID":"e9816fde-c4d0-4c01-8d09-2af0f4256fd1","Type":"ContainerDied","Data":"361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa"} Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.691437 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9885cd76-4cxdt" event={"ID":"e9816fde-c4d0-4c01-8d09-2af0f4256fd1","Type":"ContainerDied","Data":"ee15d8b8c296c51105a86cd7c291466f6a124af85521f5e19b4cfe08181eba2c"} Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.691455 4991 scope.go:117] "RemoveContainer" containerID="46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.691627 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f9885cd76-4cxdt" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.707617 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.206153787 podStartE2EDuration="6.707582038s" podCreationTimestamp="2025-10-06 08:40:09 +0000 UTC" firstStartedPulling="2025-10-06 08:40:10.497469102 +0000 UTC m=+1262.235219123" lastFinishedPulling="2025-10-06 08:40:14.998897353 +0000 UTC m=+1266.736647374" observedRunningTime="2025-10-06 08:40:15.689826122 +0000 UTC m=+1267.427576143" watchObservedRunningTime="2025-10-06 08:40:15.707582038 +0000 UTC m=+1267.445332059" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.708492 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-config" (OuterVolumeSpecName: "config") pod "e9816fde-c4d0-4c01-8d09-2af0f4256fd1" (UID: "e9816fde-c4d0-4c01-8d09-2af0f4256fd1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.731921 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.732234 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.732343 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctrl9\" (UniqueName: \"kubernetes.io/projected/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-kube-api-access-ctrl9\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.738770 4991 scope.go:117] "RemoveContainer" containerID="361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.751519 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9816fde-c4d0-4c01-8d09-2af0f4256fd1" (UID: "e9816fde-c4d0-4c01-8d09-2af0f4256fd1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.769445 4991 scope.go:117] "RemoveContainer" containerID="46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323" Oct 06 08:40:15 crc kubenswrapper[4991]: E1006 08:40:15.773463 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323\": container with ID starting with 46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323 not found: ID does not exist" containerID="46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.773512 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323"} err="failed to get container status \"46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323\": rpc error: code = NotFound desc = could not find container \"46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323\": container with ID starting with 46b32c2262db837c90bcf7cf8e7dd301c4e8f8a7f04ee42f038f090d4b5f3323 not found: ID does not exist" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.773547 4991 scope.go:117] "RemoveContainer" containerID="361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa" Oct 06 08:40:15 crc kubenswrapper[4991]: E1006 08:40:15.773909 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa\": container with ID starting with 361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa not found: ID does not exist" containerID="361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.773933 
4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa"} err="failed to get container status \"361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa\": rpc error: code = NotFound desc = could not find container \"361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa\": container with ID starting with 361c22109d9a30f028eb52ff29a56e946b8e49594c69a66accb01fff4b459daa not found: ID does not exist" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.777628 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e9816fde-c4d0-4c01-8d09-2af0f4256fd1" (UID: "e9816fde-c4d0-4c01-8d09-2af0f4256fd1"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.834472 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:15 crc kubenswrapper[4991]: I1006 08:40:15.834499 4991 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9816fde-c4d0-4c01-8d09-2af0f4256fd1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.049148 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f9885cd76-4cxdt"] Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.056222 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f9885cd76-4cxdt"] Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.068956 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zzkxl"] Oct 
06 08:40:16 crc kubenswrapper[4991]: E1006 08:40:16.069403 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" containerName="neutron-api" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.069415 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" containerName="neutron-api" Oct 06 08:40:16 crc kubenswrapper[4991]: E1006 08:40:16.069442 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b" containerName="mariadb-account-create" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.069448 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b" containerName="mariadb-account-create" Oct 06 08:40:16 crc kubenswrapper[4991]: E1006 08:40:16.069471 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="735f5180-6fe2-4632-b321-f6c96f3c9400" containerName="mariadb-account-create" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.069477 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="735f5180-6fe2-4632-b321-f6c96f3c9400" containerName="mariadb-account-create" Oct 06 08:40:16 crc kubenswrapper[4991]: E1006 08:40:16.069495 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" containerName="neutron-httpd" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.069502 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" containerName="neutron-httpd" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.069662 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" containerName="neutron-api" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.069677 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b" 
containerName="mariadb-account-create" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.069699 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="735f5180-6fe2-4632-b321-f6c96f3c9400" containerName="mariadb-account-create" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.069711 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" containerName="neutron-httpd" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.070357 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.074123 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.074185 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-djvrz" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.075459 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zzkxl"] Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.077008 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.240945 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bctrb\" (UniqueName: \"kubernetes.io/projected/b5e2805d-da62-4181-8d99-a5180a0c99e7-kube-api-access-bctrb\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.240994 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-config-data\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.241016 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-scripts\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.241068 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.342937 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bctrb\" (UniqueName: \"kubernetes.io/projected/b5e2805d-da62-4181-8d99-a5180a0c99e7-kube-api-access-bctrb\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.343002 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-config-data\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.343031 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-scripts\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.343107 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.346842 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.347918 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-scripts\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.348012 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-config-data\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.361086 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bctrb\" (UniqueName: 
\"kubernetes.io/projected/b5e2805d-da62-4181-8d99-a5180a0c99e7-kube-api-access-bctrb\") pod \"nova-cell0-conductor-db-sync-zzkxl\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.394587 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.395467 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.702323 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62e0237c-d25f-40ce-9752-5f9605d61912","Type":"ContainerDied","Data":"07c82f86f5435e9ed2843b0557d338c2ca421a602305b8fc5422a8355702925f"} Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.702282 4991 generic.go:334] "Generic (PLEG): container finished" podID="62e0237c-d25f-40ce-9752-5f9605d61912" containerID="07c82f86f5435e9ed2843b0557d338c2ca421a602305b8fc5422a8355702925f" exitCode=0 Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.702656 4991 generic.go:334] "Generic (PLEG): container finished" podID="62e0237c-d25f-40ce-9752-5f9605d61912" containerID="a593008fef4e009d1f98e807ec5f54e4172767d3892b84efb558b98bbcfbc94d" exitCode=2 Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.702689 4991 generic.go:334] "Generic (PLEG): container finished" podID="62e0237c-d25f-40ce-9752-5f9605d61912" containerID="f88af68b518b78719f896c4c87f456eb4221665cf4334316f3fba55ada1c6f0e" exitCode=0 Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.702767 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.702780 4991 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 
08:40:16.702793 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62e0237c-d25f-40ce-9752-5f9605d61912","Type":"ContainerDied","Data":"a593008fef4e009d1f98e807ec5f54e4172767d3892b84efb558b98bbcfbc94d"} Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.702828 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62e0237c-d25f-40ce-9752-5f9605d61912","Type":"ContainerDied","Data":"f88af68b518b78719f896c4c87f456eb4221665cf4334316f3fba55ada1c6f0e"} Oct 06 08:40:16 crc kubenswrapper[4991]: I1006 08:40:16.850660 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zzkxl"] Oct 06 08:40:17 crc kubenswrapper[4991]: I1006 08:40:17.238303 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:17 crc kubenswrapper[4991]: I1006 08:40:17.258819 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9816fde-c4d0-4c01-8d09-2af0f4256fd1" path="/var/lib/kubelet/pods/e9816fde-c4d0-4c01-8d09-2af0f4256fd1/volumes" Oct 06 08:40:17 crc kubenswrapper[4991]: I1006 08:40:17.306666 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 08:40:17 crc kubenswrapper[4991]: I1006 08:40:17.716978 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zzkxl" event={"ID":"b5e2805d-da62-4181-8d99-a5180a0c99e7","Type":"ContainerStarted","Data":"9e981bd7b718f206422199db8bc1ecd87c6fef91884534c81c144e697154e96b"} Oct 06 08:40:20 crc kubenswrapper[4991]: I1006 08:40:20.969433 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ea2a-account-create-4hp6d"] Oct 06 08:40:20 crc kubenswrapper[4991]: I1006 08:40:20.971103 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ea2a-account-create-4hp6d" Oct 06 08:40:20 crc kubenswrapper[4991]: I1006 08:40:20.980192 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 06 08:40:20 crc kubenswrapper[4991]: I1006 08:40:20.982719 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ea2a-account-create-4hp6d"] Oct 06 08:40:21 crc kubenswrapper[4991]: I1006 08:40:21.138788 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49crs\" (UniqueName: \"kubernetes.io/projected/37e941ea-76c3-43d1-aa41-7897065fb55a-kube-api-access-49crs\") pod \"nova-cell1-ea2a-account-create-4hp6d\" (UID: \"37e941ea-76c3-43d1-aa41-7897065fb55a\") " pod="openstack/nova-cell1-ea2a-account-create-4hp6d" Oct 06 08:40:21 crc kubenswrapper[4991]: I1006 08:40:21.240870 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49crs\" (UniqueName: \"kubernetes.io/projected/37e941ea-76c3-43d1-aa41-7897065fb55a-kube-api-access-49crs\") pod \"nova-cell1-ea2a-account-create-4hp6d\" (UID: \"37e941ea-76c3-43d1-aa41-7897065fb55a\") " pod="openstack/nova-cell1-ea2a-account-create-4hp6d" Oct 06 08:40:21 crc kubenswrapper[4991]: I1006 08:40:21.273585 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49crs\" (UniqueName: \"kubernetes.io/projected/37e941ea-76c3-43d1-aa41-7897065fb55a-kube-api-access-49crs\") pod \"nova-cell1-ea2a-account-create-4hp6d\" (UID: \"37e941ea-76c3-43d1-aa41-7897065fb55a\") " pod="openstack/nova-cell1-ea2a-account-create-4hp6d" Oct 06 08:40:21 crc kubenswrapper[4991]: I1006 08:40:21.300982 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ea2a-account-create-4hp6d" Oct 06 08:40:21 crc kubenswrapper[4991]: I1006 08:40:21.952219 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 08:40:21 crc kubenswrapper[4991]: I1006 08:40:21.952708 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 06 08:40:22 crc kubenswrapper[4991]: I1006 08:40:22.006846 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 08:40:22 crc kubenswrapper[4991]: I1006 08:40:22.015863 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 06 08:40:22 crc kubenswrapper[4991]: I1006 08:40:22.769824 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 08:40:22 crc kubenswrapper[4991]: I1006 08:40:22.770096 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 08:40:23 crc kubenswrapper[4991]: I1006 08:40:23.792846 4991 generic.go:334] "Generic (PLEG): container finished" podID="62e0237c-d25f-40ce-9752-5f9605d61912" containerID="d31707e60eb3ba97e7046bb66b77143cdb7b5c0f21ecb38d35bca6a020f6dacc" exitCode=0 Oct 06 08:40:23 crc kubenswrapper[4991]: I1006 08:40:23.792939 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62e0237c-d25f-40ce-9752-5f9605d61912","Type":"ContainerDied","Data":"d31707e60eb3ba97e7046bb66b77143cdb7b5c0f21ecb38d35bca6a020f6dacc"} Oct 06 08:40:24 crc kubenswrapper[4991]: I1006 08:40:24.798744 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 08:40:24 crc kubenswrapper[4991]: I1006 08:40:24.813123 4991 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.076361 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.353435 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.416026 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-config-data\") pod \"62e0237c-d25f-40ce-9752-5f9605d61912\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.416194 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-run-httpd\") pod \"62e0237c-d25f-40ce-9752-5f9605d61912\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.416246 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-scripts\") pod \"62e0237c-d25f-40ce-9752-5f9605d61912\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.416410 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-sg-core-conf-yaml\") pod \"62e0237c-d25f-40ce-9752-5f9605d61912\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.416458 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdb7k\" (UniqueName: 
\"kubernetes.io/projected/62e0237c-d25f-40ce-9752-5f9605d61912-kube-api-access-zdb7k\") pod \"62e0237c-d25f-40ce-9752-5f9605d61912\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.416484 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-log-httpd\") pod \"62e0237c-d25f-40ce-9752-5f9605d61912\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.416527 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-combined-ca-bundle\") pod \"62e0237c-d25f-40ce-9752-5f9605d61912\" (UID: \"62e0237c-d25f-40ce-9752-5f9605d61912\") " Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.418573 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "62e0237c-d25f-40ce-9752-5f9605d61912" (UID: "62e0237c-d25f-40ce-9752-5f9605d61912"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.420984 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "62e0237c-d25f-40ce-9752-5f9605d61912" (UID: "62e0237c-d25f-40ce-9752-5f9605d61912"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.434217 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-scripts" (OuterVolumeSpecName: "scripts") pod "62e0237c-d25f-40ce-9752-5f9605d61912" (UID: "62e0237c-d25f-40ce-9752-5f9605d61912"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.434381 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62e0237c-d25f-40ce-9752-5f9605d61912-kube-api-access-zdb7k" (OuterVolumeSpecName: "kube-api-access-zdb7k") pod "62e0237c-d25f-40ce-9752-5f9605d61912" (UID: "62e0237c-d25f-40ce-9752-5f9605d61912"). InnerVolumeSpecName "kube-api-access-zdb7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.469166 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "62e0237c-d25f-40ce-9752-5f9605d61912" (UID: "62e0237c-d25f-40ce-9752-5f9605d61912"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.518850 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.518894 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.518904 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.518912 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdb7k\" (UniqueName: \"kubernetes.io/projected/62e0237c-d25f-40ce-9752-5f9605d61912-kube-api-access-zdb7k\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.518920 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62e0237c-d25f-40ce-9752-5f9605d61912-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.525602 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62e0237c-d25f-40ce-9752-5f9605d61912" (UID: "62e0237c-d25f-40ce-9752-5f9605d61912"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.560419 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-config-data" (OuterVolumeSpecName: "config-data") pod "62e0237c-d25f-40ce-9752-5f9605d61912" (UID: "62e0237c-d25f-40ce-9752-5f9605d61912"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.620354 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.620385 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e0237c-d25f-40ce-9752-5f9605d61912-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.759657 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ea2a-account-create-4hp6d"] Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.826452 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"62e0237c-d25f-40ce-9752-5f9605d61912","Type":"ContainerDied","Data":"a5f5b2fdc64c0fbabbac848bd69b5d8b191294c2f3d031a18433a900bc6915d9"} Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.826499 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.826525 4991 scope.go:117] "RemoveContainer" containerID="07c82f86f5435e9ed2843b0557d338c2ca421a602305b8fc5422a8355702925f" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.894375 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.944019 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.957039 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:25 crc kubenswrapper[4991]: E1006 08:40:25.957447 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="ceilometer-notification-agent" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.957469 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="ceilometer-notification-agent" Oct 06 08:40:25 crc kubenswrapper[4991]: E1006 08:40:25.957501 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="ceilometer-central-agent" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.957509 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="ceilometer-central-agent" Oct 06 08:40:25 crc kubenswrapper[4991]: E1006 08:40:25.957524 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="sg-core" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.957530 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="sg-core" Oct 06 08:40:25 crc kubenswrapper[4991]: E1006 08:40:25.957560 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="proxy-httpd" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.957566 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="proxy-httpd" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.957728 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="proxy-httpd" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.957746 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="ceilometer-notification-agent" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.957764 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="sg-core" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.957778 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" containerName="ceilometer-central-agent" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.961778 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.964961 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.965160 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:40:25 crc kubenswrapper[4991]: I1006 08:40:25.968756 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.150499 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-scripts\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.151273 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.151515 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-log-httpd\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.151665 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-config-data\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " 
pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.151742 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.151808 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5md\" (UniqueName: \"kubernetes.io/projected/cae023c5-768c-4ed7-8722-c74c061e7657-kube-api-access-ql5md\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.151878 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-run-httpd\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.253027 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-config-data\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.253091 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.253154 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ql5md\" (UniqueName: \"kubernetes.io/projected/cae023c5-768c-4ed7-8722-c74c061e7657-kube-api-access-ql5md\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.253186 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-run-httpd\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.253239 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-scripts\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.253263 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.253378 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-log-httpd\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.253762 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-log-httpd\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " 
pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.254199 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-run-httpd\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.257783 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.258099 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.269817 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-scripts\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.271159 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-config-data\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.272640 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5md\" (UniqueName: 
\"kubernetes.io/projected/cae023c5-768c-4ed7-8722-c74c061e7657-kube-api-access-ql5md\") pod \"ceilometer-0\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.290679 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:26 crc kubenswrapper[4991]: W1006 08:40:26.851848 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e941ea_76c3_43d1_aa41_7897065fb55a.slice/crio-be90d72d34727ffa6ecc8849c0e97f3caf3b12bb35d7666c5a7089ce91d1d0c1 WatchSource:0}: Error finding container be90d72d34727ffa6ecc8849c0e97f3caf3b12bb35d7666c5a7089ce91d1d0c1: Status 404 returned error can't find the container with id be90d72d34727ffa6ecc8849c0e97f3caf3b12bb35d7666c5a7089ce91d1d0c1 Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.873586 4991 scope.go:117] "RemoveContainer" containerID="a593008fef4e009d1f98e807ec5f54e4172767d3892b84efb558b98bbcfbc94d" Oct 06 08:40:26 crc kubenswrapper[4991]: I1006 08:40:26.920850 4991 scope.go:117] "RemoveContainer" containerID="f88af68b518b78719f896c4c87f456eb4221665cf4334316f3fba55ada1c6f0e" Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.067429 4991 scope.go:117] "RemoveContainer" containerID="d31707e60eb3ba97e7046bb66b77143cdb7b5c0f21ecb38d35bca6a020f6dacc" Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.257079 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62e0237c-d25f-40ce-9752-5f9605d61912" path="/var/lib/kubelet/pods/62e0237c-d25f-40ce-9752-5f9605d61912/volumes" Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.377415 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:27 crc kubenswrapper[4991]: W1006 08:40:27.383143 4991 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcae023c5_768c_4ed7_8722_c74c061e7657.slice/crio-ac4de0657c370534078c320c58af93562d3bd23d49a13e866e3590f88ae3ff60 WatchSource:0}: Error finding container ac4de0657c370534078c320c58af93562d3bd23d49a13e866e3590f88ae3ff60: Status 404 returned error can't find the container with id ac4de0657c370534078c320c58af93562d3bd23d49a13e866e3590f88ae3ff60 Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.529071 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.529397 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.847975 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zzkxl" event={"ID":"b5e2805d-da62-4181-8d99-a5180a0c99e7","Type":"ContainerStarted","Data":"455474fa51c249bc946c92e04261ca0b1c51eb0b6f215aa5fd14c0a8eb825d65"} Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.854175 4991 generic.go:334] "Generic (PLEG): container finished" podID="37e941ea-76c3-43d1-aa41-7897065fb55a" containerID="bb32cfe19b795b45e56a5fe6cf63e74c7d601ad82e46c5e7cffe99ccb8d6d994" exitCode=0 Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.855171 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea2a-account-create-4hp6d" 
event={"ID":"37e941ea-76c3-43d1-aa41-7897065fb55a","Type":"ContainerDied","Data":"bb32cfe19b795b45e56a5fe6cf63e74c7d601ad82e46c5e7cffe99ccb8d6d994"} Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.855205 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea2a-account-create-4hp6d" event={"ID":"37e941ea-76c3-43d1-aa41-7897065fb55a","Type":"ContainerStarted","Data":"be90d72d34727ffa6ecc8849c0e97f3caf3b12bb35d7666c5a7089ce91d1d0c1"} Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.856930 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cae023c5-768c-4ed7-8722-c74c061e7657","Type":"ContainerStarted","Data":"ac4de0657c370534078c320c58af93562d3bd23d49a13e866e3590f88ae3ff60"} Oct 06 08:40:27 crc kubenswrapper[4991]: I1006 08:40:27.881031 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zzkxl" podStartSLOduration=1.80455179 podStartE2EDuration="11.88100964s" podCreationTimestamp="2025-10-06 08:40:16 +0000 UTC" firstStartedPulling="2025-10-06 08:40:16.859229782 +0000 UTC m=+1268.596979793" lastFinishedPulling="2025-10-06 08:40:26.935687622 +0000 UTC m=+1278.673437643" observedRunningTime="2025-10-06 08:40:27.870824636 +0000 UTC m=+1279.608574667" watchObservedRunningTime="2025-10-06 08:40:27.88100964 +0000 UTC m=+1279.618759661" Oct 06 08:40:28 crc kubenswrapper[4991]: I1006 08:40:28.868897 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cae023c5-768c-4ed7-8722-c74c061e7657","Type":"ContainerStarted","Data":"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be"} Oct 06 08:40:28 crc kubenswrapper[4991]: I1006 08:40:28.869161 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cae023c5-768c-4ed7-8722-c74c061e7657","Type":"ContainerStarted","Data":"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13"} 
Oct 06 08:40:29 crc kubenswrapper[4991]: I1006 08:40:29.274138 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ea2a-account-create-4hp6d" Oct 06 08:40:29 crc kubenswrapper[4991]: I1006 08:40:29.422722 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49crs\" (UniqueName: \"kubernetes.io/projected/37e941ea-76c3-43d1-aa41-7897065fb55a-kube-api-access-49crs\") pod \"37e941ea-76c3-43d1-aa41-7897065fb55a\" (UID: \"37e941ea-76c3-43d1-aa41-7897065fb55a\") " Oct 06 08:40:29 crc kubenswrapper[4991]: I1006 08:40:29.427917 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e941ea-76c3-43d1-aa41-7897065fb55a-kube-api-access-49crs" (OuterVolumeSpecName: "kube-api-access-49crs") pod "37e941ea-76c3-43d1-aa41-7897065fb55a" (UID: "37e941ea-76c3-43d1-aa41-7897065fb55a"). InnerVolumeSpecName "kube-api-access-49crs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:29 crc kubenswrapper[4991]: I1006 08:40:29.525397 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49crs\" (UniqueName: \"kubernetes.io/projected/37e941ea-76c3-43d1-aa41-7897065fb55a-kube-api-access-49crs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:29 crc kubenswrapper[4991]: I1006 08:40:29.878644 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-ea2a-account-create-4hp6d" Oct 06 08:40:29 crc kubenswrapper[4991]: I1006 08:40:29.878634 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ea2a-account-create-4hp6d" event={"ID":"37e941ea-76c3-43d1-aa41-7897065fb55a","Type":"ContainerDied","Data":"be90d72d34727ffa6ecc8849c0e97f3caf3b12bb35d7666c5a7089ce91d1d0c1"} Oct 06 08:40:29 crc kubenswrapper[4991]: I1006 08:40:29.878757 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be90d72d34727ffa6ecc8849c0e97f3caf3b12bb35d7666c5a7089ce91d1d0c1" Oct 06 08:40:29 crc kubenswrapper[4991]: I1006 08:40:29.880936 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cae023c5-768c-4ed7-8722-c74c061e7657","Type":"ContainerStarted","Data":"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933"} Oct 06 08:40:31 crc kubenswrapper[4991]: I1006 08:40:31.905089 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cae023c5-768c-4ed7-8722-c74c061e7657","Type":"ContainerStarted","Data":"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a"} Oct 06 08:40:31 crc kubenswrapper[4991]: I1006 08:40:31.905795 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:40:31 crc kubenswrapper[4991]: I1006 08:40:31.937251 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.595449647 podStartE2EDuration="6.93723052s" podCreationTimestamp="2025-10-06 08:40:25 +0000 UTC" firstStartedPulling="2025-10-06 08:40:27.385581365 +0000 UTC m=+1279.123331386" lastFinishedPulling="2025-10-06 08:40:30.727362218 +0000 UTC m=+1282.465112259" observedRunningTime="2025-10-06 08:40:31.931379576 +0000 UTC m=+1283.669129637" watchObservedRunningTime="2025-10-06 08:40:31.93723052 +0000 UTC m=+1283.674980551" Oct 06 08:40:34 
crc kubenswrapper[4991]: I1006 08:40:34.029207 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.030121 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="ceilometer-central-agent" containerID="cri-o://136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13" gracePeriod=30 Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.030288 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="ceilometer-notification-agent" containerID="cri-o://b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be" gracePeriod=30 Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.030459 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="sg-core" containerID="cri-o://f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933" gracePeriod=30 Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.035684 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="proxy-httpd" containerID="cri-o://2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a" gracePeriod=30 Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.728357 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.816351 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-run-httpd\") pod \"cae023c5-768c-4ed7-8722-c74c061e7657\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.816395 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-combined-ca-bundle\") pod \"cae023c5-768c-4ed7-8722-c74c061e7657\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.816424 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql5md\" (UniqueName: \"kubernetes.io/projected/cae023c5-768c-4ed7-8722-c74c061e7657-kube-api-access-ql5md\") pod \"cae023c5-768c-4ed7-8722-c74c061e7657\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.816467 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-config-data\") pod \"cae023c5-768c-4ed7-8722-c74c061e7657\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.816503 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-sg-core-conf-yaml\") pod \"cae023c5-768c-4ed7-8722-c74c061e7657\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.816576 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-log-httpd\") pod \"cae023c5-768c-4ed7-8722-c74c061e7657\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.816646 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-scripts\") pod \"cae023c5-768c-4ed7-8722-c74c061e7657\" (UID: \"cae023c5-768c-4ed7-8722-c74c061e7657\") " Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.816814 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cae023c5-768c-4ed7-8722-c74c061e7657" (UID: "cae023c5-768c-4ed7-8722-c74c061e7657"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.817029 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.817357 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cae023c5-768c-4ed7-8722-c74c061e7657" (UID: "cae023c5-768c-4ed7-8722-c74c061e7657"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.822208 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-scripts" (OuterVolumeSpecName: "scripts") pod "cae023c5-768c-4ed7-8722-c74c061e7657" (UID: "cae023c5-768c-4ed7-8722-c74c061e7657"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.822288 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae023c5-768c-4ed7-8722-c74c061e7657-kube-api-access-ql5md" (OuterVolumeSpecName: "kube-api-access-ql5md") pod "cae023c5-768c-4ed7-8722-c74c061e7657" (UID: "cae023c5-768c-4ed7-8722-c74c061e7657"). InnerVolumeSpecName "kube-api-access-ql5md". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.843450 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cae023c5-768c-4ed7-8722-c74c061e7657" (UID: "cae023c5-768c-4ed7-8722-c74c061e7657"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.887012 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cae023c5-768c-4ed7-8722-c74c061e7657" (UID: "cae023c5-768c-4ed7-8722-c74c061e7657"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.908476 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-config-data" (OuterVolumeSpecName: "config-data") pod "cae023c5-768c-4ed7-8722-c74c061e7657" (UID: "cae023c5-768c-4ed7-8722-c74c061e7657"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.920174 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.920257 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql5md\" (UniqueName: \"kubernetes.io/projected/cae023c5-768c-4ed7-8722-c74c061e7657-kube-api-access-ql5md\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.920274 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.920457 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.920478 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cae023c5-768c-4ed7-8722-c74c061e7657-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.920490 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cae023c5-768c-4ed7-8722-c74c061e7657-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933176 4991 generic.go:334] "Generic (PLEG): container finished" podID="cae023c5-768c-4ed7-8722-c74c061e7657" containerID="2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a" exitCode=0 Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933212 4991 generic.go:334] "Generic 
(PLEG): container finished" podID="cae023c5-768c-4ed7-8722-c74c061e7657" containerID="f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933" exitCode=2 Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933221 4991 generic.go:334] "Generic (PLEG): container finished" podID="cae023c5-768c-4ed7-8722-c74c061e7657" containerID="b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be" exitCode=0 Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933229 4991 generic.go:334] "Generic (PLEG): container finished" podID="cae023c5-768c-4ed7-8722-c74c061e7657" containerID="136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13" exitCode=0 Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933251 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933254 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cae023c5-768c-4ed7-8722-c74c061e7657","Type":"ContainerDied","Data":"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a"} Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933336 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cae023c5-768c-4ed7-8722-c74c061e7657","Type":"ContainerDied","Data":"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933"} Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933348 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cae023c5-768c-4ed7-8722-c74c061e7657","Type":"ContainerDied","Data":"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be"} Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933357 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cae023c5-768c-4ed7-8722-c74c061e7657","Type":"ContainerDied","Data":"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13"} Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933366 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cae023c5-768c-4ed7-8722-c74c061e7657","Type":"ContainerDied","Data":"ac4de0657c370534078c320c58af93562d3bd23d49a13e866e3590f88ae3ff60"} Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.933381 4991 scope.go:117] "RemoveContainer" containerID="2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.958178 4991 scope.go:117] "RemoveContainer" containerID="f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.963354 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.971215 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.992601 4991 scope.go:117] "RemoveContainer" containerID="b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.993829 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:34 crc kubenswrapper[4991]: E1006 08:40:34.994429 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="ceilometer-central-agent" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.994453 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="ceilometer-central-agent" Oct 06 08:40:34 crc kubenswrapper[4991]: E1006 08:40:34.994492 4991 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="sg-core" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.994500 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="sg-core" Oct 06 08:40:34 crc kubenswrapper[4991]: E1006 08:40:34.994522 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e941ea-76c3-43d1-aa41-7897065fb55a" containerName="mariadb-account-create" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.994531 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e941ea-76c3-43d1-aa41-7897065fb55a" containerName="mariadb-account-create" Oct 06 08:40:34 crc kubenswrapper[4991]: E1006 08:40:34.994565 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="ceilometer-notification-agent" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.994577 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="ceilometer-notification-agent" Oct 06 08:40:34 crc kubenswrapper[4991]: E1006 08:40:34.994603 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="proxy-httpd" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.994610 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="proxy-httpd" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.994889 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="sg-core" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.994908 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="proxy-httpd" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.994920 4991 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="ceilometer-central-agent" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.994931 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" containerName="ceilometer-notification-agent" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.994966 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e941ea-76c3-43d1-aa41-7897065fb55a" containerName="mariadb-account-create" Oct 06 08:40:34 crc kubenswrapper[4991]: I1006 08:40:34.998079 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.002337 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.002387 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.010042 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.027630 4991 scope.go:117] "RemoveContainer" containerID="136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.052749 4991 scope.go:117] "RemoveContainer" containerID="2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a" Oct 06 08:40:35 crc kubenswrapper[4991]: E1006 08:40:35.053560 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a\": container with ID starting with 2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a not found: ID does not exist" containerID="2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a" Oct 06 08:40:35 
crc kubenswrapper[4991]: I1006 08:40:35.053606 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a"} err="failed to get container status \"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a\": rpc error: code = NotFound desc = could not find container \"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a\": container with ID starting with 2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.053646 4991 scope.go:117] "RemoveContainer" containerID="f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933" Oct 06 08:40:35 crc kubenswrapper[4991]: E1006 08:40:35.054264 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933\": container with ID starting with f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933 not found: ID does not exist" containerID="f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.054314 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933"} err="failed to get container status \"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933\": rpc error: code = NotFound desc = could not find container \"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933\": container with ID starting with f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933 not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.054345 4991 scope.go:117] "RemoveContainer" containerID="b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be" Oct 06 
08:40:35 crc kubenswrapper[4991]: E1006 08:40:35.054714 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be\": container with ID starting with b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be not found: ID does not exist" containerID="b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.054740 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be"} err="failed to get container status \"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be\": rpc error: code = NotFound desc = could not find container \"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be\": container with ID starting with b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.054754 4991 scope.go:117] "RemoveContainer" containerID="136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13" Oct 06 08:40:35 crc kubenswrapper[4991]: E1006 08:40:35.055210 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13\": container with ID starting with 136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13 not found: ID does not exist" containerID="136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.055238 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13"} err="failed to get container status 
\"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13\": rpc error: code = NotFound desc = could not find container \"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13\": container with ID starting with 136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13 not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.055255 4991 scope.go:117] "RemoveContainer" containerID="2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.055918 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a"} err="failed to get container status \"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a\": rpc error: code = NotFound desc = could not find container \"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a\": container with ID starting with 2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.055942 4991 scope.go:117] "RemoveContainer" containerID="f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.056197 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933"} err="failed to get container status \"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933\": rpc error: code = NotFound desc = could not find container \"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933\": container with ID starting with f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933 not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.056225 4991 scope.go:117] "RemoveContainer" 
containerID="b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.056419 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be"} err="failed to get container status \"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be\": rpc error: code = NotFound desc = could not find container \"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be\": container with ID starting with b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.056442 4991 scope.go:117] "RemoveContainer" containerID="136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.056660 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13"} err="failed to get container status \"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13\": rpc error: code = NotFound desc = could not find container \"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13\": container with ID starting with 136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13 not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.056682 4991 scope.go:117] "RemoveContainer" containerID="2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.056966 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a"} err="failed to get container status \"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a\": rpc error: code = NotFound desc = could 
not find container \"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a\": container with ID starting with 2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.057013 4991 scope.go:117] "RemoveContainer" containerID="f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.057285 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933"} err="failed to get container status \"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933\": rpc error: code = NotFound desc = could not find container \"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933\": container with ID starting with f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933 not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.057362 4991 scope.go:117] "RemoveContainer" containerID="b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.057580 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be"} err="failed to get container status \"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be\": rpc error: code = NotFound desc = could not find container \"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be\": container with ID starting with b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.057598 4991 scope.go:117] "RemoveContainer" containerID="136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 
08:40:35.057807 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13"} err="failed to get container status \"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13\": rpc error: code = NotFound desc = could not find container \"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13\": container with ID starting with 136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13 not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.057824 4991 scope.go:117] "RemoveContainer" containerID="2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.058052 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a"} err="failed to get container status \"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a\": rpc error: code = NotFound desc = could not find container \"2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a\": container with ID starting with 2641cc4eaad77cc857c76b5bfea4c80b2ecce6cdbd1b0e21993565f663c1423a not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.058077 4991 scope.go:117] "RemoveContainer" containerID="f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.058274 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933"} err="failed to get container status \"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933\": rpc error: code = NotFound desc = could not find container \"f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933\": container with ID starting with 
f4557686faf321d2889a460bdb41688fe4c995f1f5ef0d88e135149f2a755933 not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.058289 4991 scope.go:117] "RemoveContainer" containerID="b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.058627 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be"} err="failed to get container status \"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be\": rpc error: code = NotFound desc = could not find container \"b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be\": container with ID starting with b75a7a9587dcdbab205b06832c46ff19f460369c29decad6e339e214903437be not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.058643 4991 scope.go:117] "RemoveContainer" containerID="136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.058812 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13"} err="failed to get container status \"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13\": rpc error: code = NotFound desc = could not find container \"136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13\": container with ID starting with 136435c259dfdff8dd93439894c72adc80500041b35d950e6395c12811083f13 not found: ID does not exist" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.124147 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-config-data\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " 
pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.124265 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.124353 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-run-httpd\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.124437 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-log-httpd\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.124669 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-scripts\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.124725 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.124756 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zx4t\" (UniqueName: \"kubernetes.io/projected/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-kube-api-access-5zx4t\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.226152 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-config-data\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.226202 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.226226 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-run-httpd\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.226255 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-log-httpd\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.226327 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-scripts\") pod \"ceilometer-0\" (UID: 
\"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.226345 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.226363 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zx4t\" (UniqueName: \"kubernetes.io/projected/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-kube-api-access-5zx4t\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.227034 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-run-httpd\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.227066 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-log-httpd\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.230609 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.230813 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.231419 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-config-data\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.232273 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-scripts\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.247564 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zx4t\" (UniqueName: \"kubernetes.io/projected/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-kube-api-access-5zx4t\") pod \"ceilometer-0\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.254004 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae023c5-768c-4ed7-8722-c74c061e7657" path="/var/lib/kubelet/pods/cae023c5-768c-4ed7-8722-c74c061e7657/volumes" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.331015 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.808018 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:40:35 crc kubenswrapper[4991]: W1006 08:40:35.812483 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8734a1c0_e8fd_46bc_90d4_6a7edcce1e2c.slice/crio-163a665e549e12f2fe3cff3b22897caf0563180273e0b9cbdd9676b07530f94b WatchSource:0}: Error finding container 163a665e549e12f2fe3cff3b22897caf0563180273e0b9cbdd9676b07530f94b: Status 404 returned error can't find the container with id 163a665e549e12f2fe3cff3b22897caf0563180273e0b9cbdd9676b07530f94b Oct 06 08:40:35 crc kubenswrapper[4991]: I1006 08:40:35.964010 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c","Type":"ContainerStarted","Data":"163a665e549e12f2fe3cff3b22897caf0563180273e0b9cbdd9676b07530f94b"} Oct 06 08:40:37 crc kubenswrapper[4991]: I1006 08:40:37.024682 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c","Type":"ContainerStarted","Data":"76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4"} Oct 06 08:40:38 crc kubenswrapper[4991]: I1006 08:40:38.035010 4991 generic.go:334] "Generic (PLEG): container finished" podID="b5e2805d-da62-4181-8d99-a5180a0c99e7" containerID="455474fa51c249bc946c92e04261ca0b1c51eb0b6f215aa5fd14c0a8eb825d65" exitCode=0 Oct 06 08:40:38 crc kubenswrapper[4991]: I1006 08:40:38.035099 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zzkxl" event={"ID":"b5e2805d-da62-4181-8d99-a5180a0c99e7","Type":"ContainerDied","Data":"455474fa51c249bc946c92e04261ca0b1c51eb0b6f215aa5fd14c0a8eb825d65"} Oct 06 08:40:38 crc kubenswrapper[4991]: I1006 08:40:38.038560 4991 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c","Type":"ContainerStarted","Data":"20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208"} Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.047654 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c","Type":"ContainerStarted","Data":"a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9"} Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.404826 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.559476 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-scripts\") pod \"b5e2805d-da62-4181-8d99-a5180a0c99e7\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.559617 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-combined-ca-bundle\") pod \"b5e2805d-da62-4181-8d99-a5180a0c99e7\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.559659 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bctrb\" (UniqueName: \"kubernetes.io/projected/b5e2805d-da62-4181-8d99-a5180a0c99e7-kube-api-access-bctrb\") pod \"b5e2805d-da62-4181-8d99-a5180a0c99e7\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.559735 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-config-data\") pod \"b5e2805d-da62-4181-8d99-a5180a0c99e7\" (UID: \"b5e2805d-da62-4181-8d99-a5180a0c99e7\") " Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.565936 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-scripts" (OuterVolumeSpecName: "scripts") pod "b5e2805d-da62-4181-8d99-a5180a0c99e7" (UID: "b5e2805d-da62-4181-8d99-a5180a0c99e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.578385 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e2805d-da62-4181-8d99-a5180a0c99e7-kube-api-access-bctrb" (OuterVolumeSpecName: "kube-api-access-bctrb") pod "b5e2805d-da62-4181-8d99-a5180a0c99e7" (UID: "b5e2805d-da62-4181-8d99-a5180a0c99e7"). InnerVolumeSpecName "kube-api-access-bctrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.589771 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-config-data" (OuterVolumeSpecName: "config-data") pod "b5e2805d-da62-4181-8d99-a5180a0c99e7" (UID: "b5e2805d-da62-4181-8d99-a5180a0c99e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.593337 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5e2805d-da62-4181-8d99-a5180a0c99e7" (UID: "b5e2805d-da62-4181-8d99-a5180a0c99e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.685334 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.685391 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.685404 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bctrb\" (UniqueName: \"kubernetes.io/projected/b5e2805d-da62-4181-8d99-a5180a0c99e7-kube-api-access-bctrb\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:39 crc kubenswrapper[4991]: I1006 08:40:39.685413 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e2805d-da62-4181-8d99-a5180a0c99e7-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.059318 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c","Type":"ContainerStarted","Data":"d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d"} Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.062513 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zzkxl" event={"ID":"b5e2805d-da62-4181-8d99-a5180a0c99e7","Type":"ContainerDied","Data":"9e981bd7b718f206422199db8bc1ecd87c6fef91884534c81c144e697154e96b"} Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.062549 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e981bd7b718f206422199db8bc1ecd87c6fef91884534c81c144e697154e96b" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 
08:40:40.062621 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zzkxl" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.147526 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 08:40:40 crc kubenswrapper[4991]: E1006 08:40:40.147981 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e2805d-da62-4181-8d99-a5180a0c99e7" containerName="nova-cell0-conductor-db-sync" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.147999 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e2805d-da62-4181-8d99-a5180a0c99e7" containerName="nova-cell0-conductor-db-sync" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.148192 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e2805d-da62-4181-8d99-a5180a0c99e7" containerName="nova-cell0-conductor-db-sync" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.148827 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.150931 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.151257 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-djvrz" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.159861 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.194112 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.194180 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plvkz\" (UniqueName: \"kubernetes.io/projected/697548ef-9b89-4827-a5f1-4e535ae94722-kube-api-access-plvkz\") pod \"nova-cell0-conductor-0\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.194243 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.295620 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.295684 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plvkz\" (UniqueName: \"kubernetes.io/projected/697548ef-9b89-4827-a5f1-4e535ae94722-kube-api-access-plvkz\") pod \"nova-cell0-conductor-0\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.295763 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.299330 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.299552 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.310849 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plvkz\" (UniqueName: \"kubernetes.io/projected/697548ef-9b89-4827-a5f1-4e535ae94722-kube-api-access-plvkz\") pod \"nova-cell0-conductor-0\" (UID: 
\"697548ef-9b89-4827-a5f1-4e535ae94722\") " pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.467282 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:40 crc kubenswrapper[4991]: W1006 08:40:40.890692 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod697548ef_9b89_4827_a5f1_4e535ae94722.slice/crio-91e294a7b3bb344318caee75d31a75ac164141d0fe2fb46d42bd4c98fc504e8a WatchSource:0}: Error finding container 91e294a7b3bb344318caee75d31a75ac164141d0fe2fb46d42bd4c98fc504e8a: Status 404 returned error can't find the container with id 91e294a7b3bb344318caee75d31a75ac164141d0fe2fb46d42bd4c98fc504e8a Oct 06 08:40:40 crc kubenswrapper[4991]: I1006 08:40:40.891169 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 08:40:41 crc kubenswrapper[4991]: I1006 08:40:41.074055 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"697548ef-9b89-4827-a5f1-4e535ae94722","Type":"ContainerStarted","Data":"91e294a7b3bb344318caee75d31a75ac164141d0fe2fb46d42bd4c98fc504e8a"} Oct 06 08:40:41 crc kubenswrapper[4991]: I1006 08:40:41.075336 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:40:41 crc kubenswrapper[4991]: I1006 08:40:41.107702 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.040260549 podStartE2EDuration="7.107685732s" podCreationTimestamp="2025-10-06 08:40:34 +0000 UTC" firstStartedPulling="2025-10-06 08:40:35.814898183 +0000 UTC m=+1287.552648204" lastFinishedPulling="2025-10-06 08:40:39.882323366 +0000 UTC m=+1291.620073387" observedRunningTime="2025-10-06 08:40:41.09511971 +0000 UTC m=+1292.832869731" watchObservedRunningTime="2025-10-06 
08:40:41.107685732 +0000 UTC m=+1292.845435753" Oct 06 08:40:42 crc kubenswrapper[4991]: I1006 08:40:42.088995 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"697548ef-9b89-4827-a5f1-4e535ae94722","Type":"ContainerStarted","Data":"758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98"} Oct 06 08:40:42 crc kubenswrapper[4991]: I1006 08:40:42.089379 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:42 crc kubenswrapper[4991]: I1006 08:40:42.115674 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.115646564 podStartE2EDuration="2.115646564s" podCreationTimestamp="2025-10-06 08:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:42.110654964 +0000 UTC m=+1293.848404985" watchObservedRunningTime="2025-10-06 08:40:42.115646564 +0000 UTC m=+1293.853396585" Oct 06 08:40:50 crc kubenswrapper[4991]: I1006 08:40:50.497197 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.075014 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-fswkr"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.076453 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.078580 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.078686 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.091815 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fswkr"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.213351 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.214510 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.217100 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.232055 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.232869 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-scripts\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.232929 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzkvx\" (UniqueName: \"kubernetes.io/projected/ec36c4e8-0d7b-4570-bb22-16367889063f-kube-api-access-nzkvx\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " 
pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.232955 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-config-data\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.233006 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.335001 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-scripts\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.335079 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.335136 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzkvx\" (UniqueName: \"kubernetes.io/projected/ec36c4e8-0d7b-4570-bb22-16367889063f-kube-api-access-nzkvx\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " 
pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.335164 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-config-data\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.335219 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.335237 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-config-data\") pod \"nova-scheduler-0\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.335271 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnfw5\" (UniqueName: \"kubernetes.io/projected/50e0cec2-978e-4758-9881-c2ae6db0b8c5-kube-api-access-rnfw5\") pod \"nova-scheduler-0\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.348892 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 
08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.358797 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-scripts\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.360900 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-config-data\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.372861 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.382492 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.391862 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.395134 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzkvx\" (UniqueName: \"kubernetes.io/projected/ec36c4e8-0d7b-4570-bb22-16367889063f-kube-api-access-nzkvx\") pod \"nova-cell0-cell-mapping-fswkr\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.400477 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.402308 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.403599 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.411114 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.412484 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.435699 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.440441 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-config-data\") pod \"nova-scheduler-0\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.440548 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfw5\" (UniqueName: \"kubernetes.io/projected/50e0cec2-978e-4758-9881-c2ae6db0b8c5-kube-api-access-rnfw5\") pod \"nova-scheduler-0\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.440605 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce14ecce-dbea-4d36-8308-6cc10f920cb5-logs\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.440646 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.440696 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-config-data\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.440733 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.440844 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9gfl\" (UniqueName: \"kubernetes.io/projected/ce14ecce-dbea-4d36-8308-6cc10f920cb5-kube-api-access-g9gfl\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.488407 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.489993 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.494667 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.500929 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-config-data\") pod \"nova-scheduler-0\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.510802 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.526617 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.541399 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfw5\" (UniqueName: \"kubernetes.io/projected/50e0cec2-978e-4758-9881-c2ae6db0b8c5-kube-api-access-rnfw5\") pod \"nova-scheduler-0\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.542729 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9gfl\" (UniqueName: \"kubernetes.io/projected/ce14ecce-dbea-4d36-8308-6cc10f920cb5-kube-api-access-g9gfl\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.542788 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71fc75d-0a11-4673-a14d-90f3269ff26f-logs\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.542812 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzt6d\" (UniqueName: \"kubernetes.io/projected/c71fc75d-0a11-4673-a14d-90f3269ff26f-kube-api-access-kzt6d\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.542857 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.542871 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-config-data\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.542897 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce14ecce-dbea-4d36-8308-6cc10f920cb5-logs\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.542918 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.542945 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-config-data\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.551712 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-config-data\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.551817 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.551824 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.556279 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce14ecce-dbea-4d36-8308-6cc10f920cb5-logs\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.569412 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rtlp9"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.571110 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.571703 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9gfl\" (UniqueName: \"kubernetes.io/projected/ce14ecce-dbea-4d36-8308-6cc10f920cb5-kube-api-access-g9gfl\") pod \"nova-metadata-0\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.618404 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rtlp9"] Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.634017 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.645935 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.646118 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-config-data\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.646246 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2z4f\" (UniqueName: \"kubernetes.io/projected/a4f23dd7-0459-4c71-86af-7b589d466e9d-kube-api-access-j2z4f\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.646389 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh55j\" (UniqueName: \"kubernetes.io/projected/ebecc338-537f-4b31-b992-9cc46c89ea19-kube-api-access-gh55j\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.646526 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-svc\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.646741 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.646887 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.647190 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-config\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.647623 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71fc75d-0a11-4673-a14d-90f3269ff26f-logs\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.647816 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.647943 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzt6d\" (UniqueName: \"kubernetes.io/projected/c71fc75d-0a11-4673-a14d-90f3269ff26f-kube-api-access-kzt6d\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.648084 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.648182 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.648794 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c71fc75d-0a11-4673-a14d-90f3269ff26f-logs\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.653317 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-config-data\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.655194 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.679646 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzt6d\" (UniqueName: \"kubernetes.io/projected/c71fc75d-0a11-4673-a14d-90f3269ff26f-kube-api-access-kzt6d\") pod \"nova-api-0\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.749784 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.750055 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc 
kubenswrapper[4991]: I1006 08:40:51.750094 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-config\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.750122 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.750152 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.750177 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.750244 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2z4f\" (UniqueName: \"kubernetes.io/projected/a4f23dd7-0459-4c71-86af-7b589d466e9d-kube-api-access-j2z4f\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.750287 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh55j\" (UniqueName: \"kubernetes.io/projected/ebecc338-537f-4b31-b992-9cc46c89ea19-kube-api-access-gh55j\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.750344 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-svc\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.757368 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.767671 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.782370 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-svc\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.782743 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.783149 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-config\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.783180 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.783276 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh55j\" (UniqueName: \"kubernetes.io/projected/ebecc338-537f-4b31-b992-9cc46c89ea19-kube-api-access-gh55j\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.784918 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.786962 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2z4f\" (UniqueName: \"kubernetes.io/projected/a4f23dd7-0459-4c71-86af-7b589d466e9d-kube-api-access-j2z4f\") pod 
\"dnsmasq-dns-865f5d856f-rtlp9\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.945329 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.957834 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:51 crc kubenswrapper[4991]: I1006 08:40:51.982358 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.058472 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fswkr"] Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.130268 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-smsnb"] Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.134202 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.137748 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.138083 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.146019 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-smsnb"] Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.200846 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:40:52 crc kubenswrapper[4991]: W1006 08:40:52.236304 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50e0cec2_978e_4758_9881_c2ae6db0b8c5.slice/crio-3b0ac81739f0ba38179dc4646c80ace753732c50d96fcba09a8014c4c088d9a1 WatchSource:0}: Error finding container 3b0ac81739f0ba38179dc4646c80ace753732c50d96fcba09a8014c4c088d9a1: Status 404 returned error can't find the container with id 3b0ac81739f0ba38179dc4646c80ace753732c50d96fcba09a8014c4c088d9a1 Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.250694 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.257117 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fswkr" event={"ID":"ec36c4e8-0d7b-4570-bb22-16367889063f","Type":"ContainerStarted","Data":"76a725cdce59c6fb3484d95b32f735b90073fc9673da255721cbeaf0a296c7b5"} Oct 06 08:40:52 crc kubenswrapper[4991]: W1006 08:40:52.266205 4991 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce14ecce_dbea_4d36_8308_6cc10f920cb5.slice/crio-b916c2c566e65ef83401103f3b44d307a9b7dd0425d1ee5d11f4ae7c822611e2 WatchSource:0}: Error finding container b916c2c566e65ef83401103f3b44d307a9b7dd0425d1ee5d11f4ae7c822611e2: Status 404 returned error can't find the container with id b916c2c566e65ef83401103f3b44d307a9b7dd0425d1ee5d11f4ae7c822611e2 Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.272486 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-scripts\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.272565 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzhm8\" (UniqueName: \"kubernetes.io/projected/ea12773e-89f6-478d-92f5-23bfb4a05a6a-kube-api-access-tzhm8\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.272600 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.272728 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-config-data\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: 
\"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.374577 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-scripts\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.374644 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzhm8\" (UniqueName: \"kubernetes.io/projected/ea12773e-89f6-478d-92f5-23bfb4a05a6a-kube-api-access-tzhm8\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.374695 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.374820 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-config-data\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.380560 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-scripts\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: 
\"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.380950 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.389389 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-config-data\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.391834 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzhm8\" (UniqueName: \"kubernetes.io/projected/ea12773e-89f6-478d-92f5-23bfb4a05a6a-kube-api-access-tzhm8\") pod \"nova-cell1-conductor-db-sync-smsnb\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.474418 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.545568 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:40:52 crc kubenswrapper[4991]: W1006 08:40:52.568641 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc71fc75d_0a11_4673_a14d_90f3269ff26f.slice/crio-cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597 WatchSource:0}: Error finding container cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597: Status 404 returned error can't find the container with id cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597 Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.652273 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:40:52 crc kubenswrapper[4991]: W1006 08:40:52.660871 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebecc338_537f_4b31_b992_9cc46c89ea19.slice/crio-199f23db57a5cfa0b7e10ef818bfa256ec4fcb89ff9c4a0c2fb11fca82dba413 WatchSource:0}: Error finding container 199f23db57a5cfa0b7e10ef818bfa256ec4fcb89ff9c4a0c2fb11fca82dba413: Status 404 returned error can't find the container with id 199f23db57a5cfa0b7e10ef818bfa256ec4fcb89ff9c4a0c2fb11fca82dba413 Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.674006 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rtlp9"] Oct 06 08:40:52 crc kubenswrapper[4991]: I1006 08:40:52.981627 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-smsnb"] Oct 06 08:40:53 crc kubenswrapper[4991]: W1006 08:40:53.013959 4991 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea12773e_89f6_478d_92f5_23bfb4a05a6a.slice/crio-caad4f004c8746783d23cbdd0590a1bd7e73f901af22b44ecfbbfbfa9512a58d WatchSource:0}: Error finding container caad4f004c8746783d23cbdd0590a1bd7e73f901af22b44ecfbbfbfa9512a58d: Status 404 returned error can't find the container with id caad4f004c8746783d23cbdd0590a1bd7e73f901af22b44ecfbbfbfa9512a58d Oct 06 08:40:53 crc kubenswrapper[4991]: I1006 08:40:53.280319 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50e0cec2-978e-4758-9881-c2ae6db0b8c5","Type":"ContainerStarted","Data":"3b0ac81739f0ba38179dc4646c80ace753732c50d96fcba09a8014c4c088d9a1"} Oct 06 08:40:53 crc kubenswrapper[4991]: I1006 08:40:53.281170 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce14ecce-dbea-4d36-8308-6cc10f920cb5","Type":"ContainerStarted","Data":"b916c2c566e65ef83401103f3b44d307a9b7dd0425d1ee5d11f4ae7c822611e2"} Oct 06 08:40:53 crc kubenswrapper[4991]: I1006 08:40:53.284231 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-smsnb" event={"ID":"ea12773e-89f6-478d-92f5-23bfb4a05a6a","Type":"ContainerStarted","Data":"caad4f004c8746783d23cbdd0590a1bd7e73f901af22b44ecfbbfbfa9512a58d"} Oct 06 08:40:53 crc kubenswrapper[4991]: I1006 08:40:53.286945 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c71fc75d-0a11-4673-a14d-90f3269ff26f","Type":"ContainerStarted","Data":"cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597"} Oct 06 08:40:53 crc kubenswrapper[4991]: I1006 08:40:53.289118 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fswkr" event={"ID":"ec36c4e8-0d7b-4570-bb22-16367889063f","Type":"ContainerStarted","Data":"1428ac64bc5e21255061577f68b26d4466a1e854d5bb4746502e41b12d412d03"} Oct 06 08:40:53 crc kubenswrapper[4991]: I1006 
08:40:53.298728 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ebecc338-537f-4b31-b992-9cc46c89ea19","Type":"ContainerStarted","Data":"199f23db57a5cfa0b7e10ef818bfa256ec4fcb89ff9c4a0c2fb11fca82dba413"} Oct 06 08:40:53 crc kubenswrapper[4991]: I1006 08:40:53.303484 4991 generic.go:334] "Generic (PLEG): container finished" podID="a4f23dd7-0459-4c71-86af-7b589d466e9d" containerID="8d08c1c46e9dbb33c0fcf110f50fb8a57ea2588d6e3f2a5e95349068fb7c093c" exitCode=0 Oct 06 08:40:53 crc kubenswrapper[4991]: I1006 08:40:53.303537 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" event={"ID":"a4f23dd7-0459-4c71-86af-7b589d466e9d","Type":"ContainerDied","Data":"8d08c1c46e9dbb33c0fcf110f50fb8a57ea2588d6e3f2a5e95349068fb7c093c"} Oct 06 08:40:53 crc kubenswrapper[4991]: I1006 08:40:53.303567 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" event={"ID":"a4f23dd7-0459-4c71-86af-7b589d466e9d","Type":"ContainerStarted","Data":"c40b4d594cbc2a44239e5ded1aca2ab811b0840e59efb067f1c2cab61e24e0ca"} Oct 06 08:40:53 crc kubenswrapper[4991]: I1006 08:40:53.310701 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-fswkr" podStartSLOduration=2.310683141 podStartE2EDuration="2.310683141s" podCreationTimestamp="2025-10-06 08:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:53.303256023 +0000 UTC m=+1305.041006074" watchObservedRunningTime="2025-10-06 08:40:53.310683141 +0000 UTC m=+1305.048433162" Oct 06 08:40:54 crc kubenswrapper[4991]: I1006 08:40:54.349560 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-smsnb" 
event={"ID":"ea12773e-89f6-478d-92f5-23bfb4a05a6a","Type":"ContainerStarted","Data":"cbef79c571778565a610536c8feaba7e67321af73c156f93f621ec4d91f65fe9"} Oct 06 08:40:54 crc kubenswrapper[4991]: I1006 08:40:54.367585 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" event={"ID":"a4f23dd7-0459-4c71-86af-7b589d466e9d","Type":"ContainerStarted","Data":"90cd40de1bdce0b9010647126aca623edc030c70b8cf3e25c080af7f4f0d06b5"} Oct 06 08:40:54 crc kubenswrapper[4991]: I1006 08:40:54.367879 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:40:54 crc kubenswrapper[4991]: I1006 08:40:54.379693 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-smsnb" podStartSLOduration=2.379678132 podStartE2EDuration="2.379678132s" podCreationTimestamp="2025-10-06 08:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:54.377763198 +0000 UTC m=+1306.115513219" watchObservedRunningTime="2025-10-06 08:40:54.379678132 +0000 UTC m=+1306.117428153" Oct 06 08:40:54 crc kubenswrapper[4991]: I1006 08:40:54.400659 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50e0cec2-978e-4758-9881-c2ae6db0b8c5","Type":"ContainerStarted","Data":"ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b"} Oct 06 08:40:54 crc kubenswrapper[4991]: I1006 08:40:54.418345 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" podStartSLOduration=3.418324703 podStartE2EDuration="3.418324703s" podCreationTimestamp="2025-10-06 08:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:54.409595209 +0000 UTC 
m=+1306.147345230" watchObservedRunningTime="2025-10-06 08:40:54.418324703 +0000 UTC m=+1306.156074724" Oct 06 08:40:54 crc kubenswrapper[4991]: I1006 08:40:54.447430 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.164196692 podStartE2EDuration="3.447410308s" podCreationTimestamp="2025-10-06 08:40:51 +0000 UTC" firstStartedPulling="2025-10-06 08:40:52.254834259 +0000 UTC m=+1303.992584290" lastFinishedPulling="2025-10-06 08:40:53.538047885 +0000 UTC m=+1305.275797906" observedRunningTime="2025-10-06 08:40:54.432673774 +0000 UTC m=+1306.170423795" watchObservedRunningTime="2025-10-06 08:40:54.447410308 +0000 UTC m=+1306.185160329" Oct 06 08:40:54 crc kubenswrapper[4991]: I1006 08:40:54.939407 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:54 crc kubenswrapper[4991]: I1006 08:40:54.949576 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.420728 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce14ecce-dbea-4d36-8308-6cc10f920cb5","Type":"ContainerStarted","Data":"0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188"} Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.421267 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce14ecce-dbea-4d36-8308-6cc10f920cb5","Type":"ContainerStarted","Data":"38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19"} Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.420865 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" containerName="nova-metadata-metadata" containerID="cri-o://0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188" gracePeriod=30 Oct 06 
08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.420799 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" containerName="nova-metadata-log" containerID="cri-o://38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19" gracePeriod=30 Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.423042 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c71fc75d-0a11-4673-a14d-90f3269ff26f","Type":"ContainerStarted","Data":"5b868b9187832c2005be815975b42773e519e81fb95f561cfb8a51e94477be13"} Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.423100 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c71fc75d-0a11-4673-a14d-90f3269ff26f","Type":"ContainerStarted","Data":"62e47d84171c7222528d366552dba6e76ba86c5bd424f4e5ce7c51dc4772d323"} Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.432399 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ebecc338-537f-4b31-b992-9cc46c89ea19" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://afaad24da82e9977eb0954a81eb93a35ce855f528655c94b4ae6d47f4f212c3d" gracePeriod=30 Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.432774 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ebecc338-537f-4b31-b992-9cc46c89ea19","Type":"ContainerStarted","Data":"afaad24da82e9977eb0954a81eb93a35ce855f528655c94b4ae6d47f4f212c3d"} Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.441692 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.558381355 podStartE2EDuration="5.441668195s" podCreationTimestamp="2025-10-06 08:40:51 +0000 UTC" firstStartedPulling="2025-10-06 08:40:52.268145232 +0000 UTC m=+1304.005895253" 
lastFinishedPulling="2025-10-06 08:40:55.151432082 +0000 UTC m=+1306.889182093" observedRunningTime="2025-10-06 08:40:56.438548538 +0000 UTC m=+1308.176298559" watchObservedRunningTime="2025-10-06 08:40:56.441668195 +0000 UTC m=+1308.179418216" Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.465165 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.982672791 podStartE2EDuration="5.465148172s" podCreationTimestamp="2025-10-06 08:40:51 +0000 UTC" firstStartedPulling="2025-10-06 08:40:52.66893057 +0000 UTC m=+1304.406680591" lastFinishedPulling="2025-10-06 08:40:55.151405951 +0000 UTC m=+1306.889155972" observedRunningTime="2025-10-06 08:40:56.458388763 +0000 UTC m=+1308.196138784" watchObservedRunningTime="2025-10-06 08:40:56.465148172 +0000 UTC m=+1308.202898193" Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.480750 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.896361493 podStartE2EDuration="5.480732438s" podCreationTimestamp="2025-10-06 08:40:51 +0000 UTC" firstStartedPulling="2025-10-06 08:40:52.571386229 +0000 UTC m=+1304.309136250" lastFinishedPulling="2025-10-06 08:40:55.155757174 +0000 UTC m=+1306.893507195" observedRunningTime="2025-10-06 08:40:56.472656522 +0000 UTC m=+1308.210406533" watchObservedRunningTime="2025-10-06 08:40:56.480732438 +0000 UTC m=+1308.218482459" Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.552706 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.634785 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 08:40:56 crc kubenswrapper[4991]: I1006 08:40:56.634837 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 08:40:56 crc 
kubenswrapper[4991]: I1006 08:40:56.958084 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.021588 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.089842 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-combined-ca-bundle\") pod \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.090035 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce14ecce-dbea-4d36-8308-6cc10f920cb5-logs\") pod \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.090078 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-config-data\") pod \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.090191 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9gfl\" (UniqueName: \"kubernetes.io/projected/ce14ecce-dbea-4d36-8308-6cc10f920cb5-kube-api-access-g9gfl\") pod \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\" (UID: \"ce14ecce-dbea-4d36-8308-6cc10f920cb5\") " Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.092837 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce14ecce-dbea-4d36-8308-6cc10f920cb5-logs" (OuterVolumeSpecName: "logs") pod 
"ce14ecce-dbea-4d36-8308-6cc10f920cb5" (UID: "ce14ecce-dbea-4d36-8308-6cc10f920cb5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.099918 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce14ecce-dbea-4d36-8308-6cc10f920cb5-kube-api-access-g9gfl" (OuterVolumeSpecName: "kube-api-access-g9gfl") pod "ce14ecce-dbea-4d36-8308-6cc10f920cb5" (UID: "ce14ecce-dbea-4d36-8308-6cc10f920cb5"). InnerVolumeSpecName "kube-api-access-g9gfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.127581 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-config-data" (OuterVolumeSpecName: "config-data") pod "ce14ecce-dbea-4d36-8308-6cc10f920cb5" (UID: "ce14ecce-dbea-4d36-8308-6cc10f920cb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.133002 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce14ecce-dbea-4d36-8308-6cc10f920cb5" (UID: "ce14ecce-dbea-4d36-8308-6cc10f920cb5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.193033 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.193068 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce14ecce-dbea-4d36-8308-6cc10f920cb5-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.193078 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce14ecce-dbea-4d36-8308-6cc10f920cb5-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.193088 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9gfl\" (UniqueName: \"kubernetes.io/projected/ce14ecce-dbea-4d36-8308-6cc10f920cb5-kube-api-access-g9gfl\") on node \"crc\" DevicePath \"\"" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.442752 4991 generic.go:334] "Generic (PLEG): container finished" podID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" containerID="0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188" exitCode=0 Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.442788 4991 generic.go:334] "Generic (PLEG): container finished" podID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" containerID="38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19" exitCode=143 Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.443849 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.444311 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce14ecce-dbea-4d36-8308-6cc10f920cb5","Type":"ContainerDied","Data":"0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188"} Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.444473 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce14ecce-dbea-4d36-8308-6cc10f920cb5","Type":"ContainerDied","Data":"38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19"} Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.444578 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce14ecce-dbea-4d36-8308-6cc10f920cb5","Type":"ContainerDied","Data":"b916c2c566e65ef83401103f3b44d307a9b7dd0425d1ee5d11f4ae7c822611e2"} Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.444701 4991 scope.go:117] "RemoveContainer" containerID="0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.472672 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.477964 4991 scope.go:117] "RemoveContainer" containerID="38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.484172 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.493518 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:57 crc kubenswrapper[4991]: E1006 08:40:57.493979 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" containerName="nova-metadata-log" Oct 06 08:40:57 crc 
kubenswrapper[4991]: I1006 08:40:57.494002 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" containerName="nova-metadata-log" Oct 06 08:40:57 crc kubenswrapper[4991]: E1006 08:40:57.494033 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" containerName="nova-metadata-metadata" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.494042 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" containerName="nova-metadata-metadata" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.494311 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" containerName="nova-metadata-log" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.494383 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" containerName="nova-metadata-metadata" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.496403 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.501673 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.506792 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.509819 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.513665 4991 scope.go:117] "RemoveContainer" containerID="0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188" Oct 06 08:40:57 crc kubenswrapper[4991]: E1006 08:40:57.527070 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188\": container with ID starting with 0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188 not found: ID does not exist" containerID="0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.527327 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188"} err="failed to get container status \"0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188\": rpc error: code = NotFound desc = could not find container \"0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188\": container with ID starting with 0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188 not found: ID does not exist" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.527457 4991 scope.go:117] "RemoveContainer" containerID="38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19" Oct 06 08:40:57 crc 
kubenswrapper[4991]: I1006 08:40:57.528510 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.528707 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.528856 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.529795 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"588bca8d19a8065db7c6c040db1c1694b8c7daffc697ab9a2f8788b4b3c06abd"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.529968 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" containerID="cri-o://588bca8d19a8065db7c6c040db1c1694b8c7daffc697ab9a2f8788b4b3c06abd" gracePeriod=600 Oct 06 08:40:57 crc kubenswrapper[4991]: E1006 08:40:57.528659 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19\": container with ID starting with 38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19 not found: ID does not exist" containerID="38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.530311 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19"} err="failed to get container status \"38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19\": rpc error: code = NotFound desc = could not find container \"38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19\": container with ID starting with 38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19 not found: ID does not exist" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.530430 4991 scope.go:117] "RemoveContainer" containerID="0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.531776 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188"} err="failed to get container status \"0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188\": rpc error: code = NotFound desc = could not find container \"0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188\": container with ID starting with 0a8edf8248092298a72609cc118f61b0e0676f35a85aa404b366c2c6cab53188 not found: ID does not exist" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.531895 4991 scope.go:117] "RemoveContainer" containerID="38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.532269 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19"} err="failed to get container status \"38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19\": rpc error: code = NotFound desc = could not find container \"38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19\": container with ID starting with 38cc33f86a854a0622c66fa444684e6c07318c40239b4cb3f6d1a5723e3d8b19 not found: ID does not exist" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.606649 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf8216a7-a91a-4b96-88f8-f12c836ba326-logs\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.607386 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.607528 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vxhm\" (UniqueName: \"kubernetes.io/projected/bf8216a7-a91a-4b96-88f8-f12c836ba326-kube-api-access-4vxhm\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.607644 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-config-data\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 
08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.607813 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.709307 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf8216a7-a91a-4b96-88f8-f12c836ba326-logs\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.709864 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.710077 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vxhm\" (UniqueName: \"kubernetes.io/projected/bf8216a7-a91a-4b96-88f8-f12c836ba326-kube-api-access-4vxhm\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.710496 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-config-data\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.710954 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.710042 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf8216a7-a91a-4b96-88f8-f12c836ba326-logs\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.716394 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-config-data\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.716674 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.719859 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 crc kubenswrapper[4991]: I1006 08:40:57.729602 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vxhm\" (UniqueName: \"kubernetes.io/projected/bf8216a7-a91a-4b96-88f8-f12c836ba326-kube-api-access-4vxhm\") pod \"nova-metadata-0\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " pod="openstack/nova-metadata-0" Oct 06 08:40:57 
crc kubenswrapper[4991]: I1006 08:40:57.827589 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:40:58 crc kubenswrapper[4991]: I1006 08:40:58.266350 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:40:58 crc kubenswrapper[4991]: W1006 08:40:58.276662 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf8216a7_a91a_4b96_88f8_f12c836ba326.slice/crio-2188e0335cc18f6e064608a19f348a65e0862ec580dc419a95d9a758739cc25a WatchSource:0}: Error finding container 2188e0335cc18f6e064608a19f348a65e0862ec580dc419a95d9a758739cc25a: Status 404 returned error can't find the container with id 2188e0335cc18f6e064608a19f348a65e0862ec580dc419a95d9a758739cc25a Oct 06 08:40:58 crc kubenswrapper[4991]: I1006 08:40:58.455894 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="588bca8d19a8065db7c6c040db1c1694b8c7daffc697ab9a2f8788b4b3c06abd" exitCode=0 Oct 06 08:40:58 crc kubenswrapper[4991]: I1006 08:40:58.455993 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"588bca8d19a8065db7c6c040db1c1694b8c7daffc697ab9a2f8788b4b3c06abd"} Oct 06 08:40:58 crc kubenswrapper[4991]: I1006 08:40:58.456303 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36"} Oct 06 08:40:58 crc kubenswrapper[4991]: I1006 08:40:58.456352 4991 scope.go:117] "RemoveContainer" containerID="e1369062046a805994e1e0d5f87b5a6e887447735123010879df4c4305faa2ba" Oct 06 08:40:58 crc kubenswrapper[4991]: 
I1006 08:40:58.460550 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8216a7-a91a-4b96-88f8-f12c836ba326","Type":"ContainerStarted","Data":"2188e0335cc18f6e064608a19f348a65e0862ec580dc419a95d9a758739cc25a"} Oct 06 08:40:59 crc kubenswrapper[4991]: I1006 08:40:59.260828 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce14ecce-dbea-4d36-8308-6cc10f920cb5" path="/var/lib/kubelet/pods/ce14ecce-dbea-4d36-8308-6cc10f920cb5/volumes" Oct 06 08:40:59 crc kubenswrapper[4991]: I1006 08:40:59.476098 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8216a7-a91a-4b96-88f8-f12c836ba326","Type":"ContainerStarted","Data":"01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991"} Oct 06 08:40:59 crc kubenswrapper[4991]: I1006 08:40:59.476154 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8216a7-a91a-4b96-88f8-f12c836ba326","Type":"ContainerStarted","Data":"15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636"} Oct 06 08:41:00 crc kubenswrapper[4991]: I1006 08:41:00.492977 4991 generic.go:334] "Generic (PLEG): container finished" podID="ec36c4e8-0d7b-4570-bb22-16367889063f" containerID="1428ac64bc5e21255061577f68b26d4466a1e854d5bb4746502e41b12d412d03" exitCode=0 Oct 06 08:41:00 crc kubenswrapper[4991]: I1006 08:41:00.493071 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fswkr" event={"ID":"ec36c4e8-0d7b-4570-bb22-16367889063f","Type":"ContainerDied","Data":"1428ac64bc5e21255061577f68b26d4466a1e854d5bb4746502e41b12d412d03"} Oct 06 08:41:00 crc kubenswrapper[4991]: I1006 08:41:00.519774 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.519757416 podStartE2EDuration="3.519757416s" podCreationTimestamp="2025-10-06 08:40:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:40:59.50485829 +0000 UTC m=+1311.242608321" watchObservedRunningTime="2025-10-06 08:41:00.519757416 +0000 UTC m=+1312.257507437" Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.509872 4991 generic.go:334] "Generic (PLEG): container finished" podID="ea12773e-89f6-478d-92f5-23bfb4a05a6a" containerID="cbef79c571778565a610536c8feaba7e67321af73c156f93f621ec4d91f65fe9" exitCode=0 Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.510021 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-smsnb" event={"ID":"ea12773e-89f6-478d-92f5-23bfb4a05a6a","Type":"ContainerDied","Data":"cbef79c571778565a610536c8feaba7e67321af73c156f93f621ec4d91f65fe9"} Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.555222 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.606536 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.933074 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.945590 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.945653 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.984551 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.989487 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-combined-ca-bundle\") pod \"ec36c4e8-0d7b-4570-bb22-16367889063f\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.989624 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-config-data\") pod \"ec36c4e8-0d7b-4570-bb22-16367889063f\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.989706 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-scripts\") pod \"ec36c4e8-0d7b-4570-bb22-16367889063f\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " Oct 06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.989741 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzkvx\" (UniqueName: \"kubernetes.io/projected/ec36c4e8-0d7b-4570-bb22-16367889063f-kube-api-access-nzkvx\") pod \"ec36c4e8-0d7b-4570-bb22-16367889063f\" (UID: \"ec36c4e8-0d7b-4570-bb22-16367889063f\") " Oct 
06 08:41:01 crc kubenswrapper[4991]: I1006 08:41:01.998147 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec36c4e8-0d7b-4570-bb22-16367889063f-kube-api-access-nzkvx" (OuterVolumeSpecName: "kube-api-access-nzkvx") pod "ec36c4e8-0d7b-4570-bb22-16367889063f" (UID: "ec36c4e8-0d7b-4570-bb22-16367889063f"). InnerVolumeSpecName "kube-api-access-nzkvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.001472 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-scripts" (OuterVolumeSpecName: "scripts") pod "ec36c4e8-0d7b-4570-bb22-16367889063f" (UID: "ec36c4e8-0d7b-4570-bb22-16367889063f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.038009 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec36c4e8-0d7b-4570-bb22-16367889063f" (UID: "ec36c4e8-0d7b-4570-bb22-16367889063f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.048101 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-config-data" (OuterVolumeSpecName: "config-data") pod "ec36c4e8-0d7b-4570-bb22-16367889063f" (UID: "ec36c4e8-0d7b-4570-bb22-16367889063f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.049356 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-79twg"] Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.049628 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" podUID="0f68ff49-be6e-460f-91e6-ec7d260e0aff" containerName="dnsmasq-dns" containerID="cri-o://ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3" gracePeriod=10 Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.091885 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.091920 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.091929 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzkvx\" (UniqueName: \"kubernetes.io/projected/ec36c4e8-0d7b-4570-bb22-16367889063f-kube-api-access-nzkvx\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.091938 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec36c4e8-0d7b-4570-bb22-16367889063f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.495718 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.527637 4991 generic.go:334] "Generic (PLEG): container finished" podID="0f68ff49-be6e-460f-91e6-ec7d260e0aff" containerID="ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3" exitCode=0 Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.527708 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" event={"ID":"0f68ff49-be6e-460f-91e6-ec7d260e0aff","Type":"ContainerDied","Data":"ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3"} Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.527735 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" event={"ID":"0f68ff49-be6e-460f-91e6-ec7d260e0aff","Type":"ContainerDied","Data":"21328eefffe50960b453c10904ea78ee8e4abe5a8c6152acf5097cffd82244ca"} Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.527755 4991 scope.go:117] "RemoveContainer" containerID="ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.527885 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-79twg" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.539363 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fswkr" event={"ID":"ec36c4e8-0d7b-4570-bb22-16367889063f","Type":"ContainerDied","Data":"76a725cdce59c6fb3484d95b32f735b90073fc9673da255721cbeaf0a296c7b5"} Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.539388 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76a725cdce59c6fb3484d95b32f735b90073fc9673da255721cbeaf0a296c7b5" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.539301 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fswkr" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.564384 4991 scope.go:117] "RemoveContainer" containerID="923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.590176 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.600286 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-swift-storage-0\") pod \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.600375 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-sb\") pod \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.601009 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-config\") pod \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.601071 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-svc\") pod \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.601157 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-nb\") pod \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.601183 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb4d8\" (UniqueName: \"kubernetes.io/projected/0f68ff49-be6e-460f-91e6-ec7d260e0aff-kube-api-access-jb4d8\") pod \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\" (UID: \"0f68ff49-be6e-460f-91e6-ec7d260e0aff\") " Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.611480 4991 scope.go:117] "RemoveContainer" containerID="ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3" Oct 06 08:41:02 crc kubenswrapper[4991]: E1006 08:41:02.612830 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3\": container with ID starting with ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3 not found: ID does not exist" containerID="ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.612877 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3"} err="failed to get container status \"ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3\": rpc error: code = NotFound desc = could not find container \"ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3\": container with ID starting with ddddafef1b7d9bd1c65ab2a278430b4f567ceec29d02540dda9cccf6e10b69a3 not found: ID does not exist" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.612902 4991 scope.go:117] "RemoveContainer" containerID="923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd" Oct 06 08:41:02 crc 
kubenswrapper[4991]: E1006 08:41:02.613264 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd\": container with ID starting with 923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd not found: ID does not exist" containerID="923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.613288 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd"} err="failed to get container status \"923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd\": rpc error: code = NotFound desc = could not find container \"923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd\": container with ID starting with 923f5be9c04c65e2efbd24f3d3427b1c024bf3f30fc4d1912cc3c429e3c344cd not found: ID does not exist" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.615837 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f68ff49-be6e-460f-91e6-ec7d260e0aff-kube-api-access-jb4d8" (OuterVolumeSpecName: "kube-api-access-jb4d8") pod "0f68ff49-be6e-460f-91e6-ec7d260e0aff" (UID: "0f68ff49-be6e-460f-91e6-ec7d260e0aff"). InnerVolumeSpecName "kube-api-access-jb4d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.672186 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f68ff49-be6e-460f-91e6-ec7d260e0aff" (UID: "0f68ff49-be6e-460f-91e6-ec7d260e0aff"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.690772 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-config" (OuterVolumeSpecName: "config") pod "0f68ff49-be6e-460f-91e6-ec7d260e0aff" (UID: "0f68ff49-be6e-460f-91e6-ec7d260e0aff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.697750 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f68ff49-be6e-460f-91e6-ec7d260e0aff" (UID: "0f68ff49-be6e-460f-91e6-ec7d260e0aff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.698687 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f68ff49-be6e-460f-91e6-ec7d260e0aff" (UID: "0f68ff49-be6e-460f-91e6-ec7d260e0aff"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.703458 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.703486 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb4d8\" (UniqueName: \"kubernetes.io/projected/0f68ff49-be6e-460f-91e6-ec7d260e0aff-kube-api-access-jb4d8\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.703498 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.703507 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.703517 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.708280 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0f68ff49-be6e-460f-91e6-ec7d260e0aff" (UID: "0f68ff49-be6e-460f-91e6-ec7d260e0aff"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.778091 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.779715 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerName="nova-api-log" containerID="cri-o://62e47d84171c7222528d366552dba6e76ba86c5bd424f4e5ce7c51dc4772d323" gracePeriod=30 Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.780172 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerName="nova-api-api" containerID="cri-o://5b868b9187832c2005be815975b42773e519e81fb95f561cfb8a51e94477be13" gracePeriod=30 Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.787554 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": EOF" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.787581 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": EOF" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.818781 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.819284 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bf8216a7-a91a-4b96-88f8-f12c836ba326" containerName="nova-metadata-log" containerID="cri-o://15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636" gracePeriod=30 Oct 06 08:41:02 crc 
kubenswrapper[4991]: I1006 08:41:02.820060 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bf8216a7-a91a-4b96-88f8-f12c836ba326" containerName="nova-metadata-metadata" containerID="cri-o://01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991" gracePeriod=30 Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.822179 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f68ff49-be6e-460f-91e6-ec7d260e0aff-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.829899 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 08:41:02 crc kubenswrapper[4991]: I1006 08:41:02.829967 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.056905 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.097543 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.124057 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-79twg"] Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.132852 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-79twg"] Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.227585 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-combined-ca-bundle\") pod \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.227639 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-config-data\") pod \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.227685 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzhm8\" (UniqueName: \"kubernetes.io/projected/ea12773e-89f6-478d-92f5-23bfb4a05a6a-kube-api-access-tzhm8\") pod \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.227710 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-scripts\") pod \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\" (UID: \"ea12773e-89f6-478d-92f5-23bfb4a05a6a\") " Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.232903 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ea12773e-89f6-478d-92f5-23bfb4a05a6a-kube-api-access-tzhm8" (OuterVolumeSpecName: "kube-api-access-tzhm8") pod "ea12773e-89f6-478d-92f5-23bfb4a05a6a" (UID: "ea12773e-89f6-478d-92f5-23bfb4a05a6a"). InnerVolumeSpecName "kube-api-access-tzhm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.233467 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-scripts" (OuterVolumeSpecName: "scripts") pod "ea12773e-89f6-478d-92f5-23bfb4a05a6a" (UID: "ea12773e-89f6-478d-92f5-23bfb4a05a6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.265196 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea12773e-89f6-478d-92f5-23bfb4a05a6a" (UID: "ea12773e-89f6-478d-92f5-23bfb4a05a6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.265711 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-config-data" (OuterVolumeSpecName: "config-data") pod "ea12773e-89f6-478d-92f5-23bfb4a05a6a" (UID: "ea12773e-89f6-478d-92f5-23bfb4a05a6a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.267756 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f68ff49-be6e-460f-91e6-ec7d260e0aff" path="/var/lib/kubelet/pods/0f68ff49-be6e-460f-91e6-ec7d260e0aff/volumes" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.335417 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.335477 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.335492 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzhm8\" (UniqueName: \"kubernetes.io/projected/ea12773e-89f6-478d-92f5-23bfb4a05a6a-kube-api-access-tzhm8\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.335507 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea12773e-89f6-478d-92f5-23bfb4a05a6a-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.386222 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.437319 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-nova-metadata-tls-certs\") pod \"bf8216a7-a91a-4b96-88f8-f12c836ba326\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.437379 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vxhm\" (UniqueName: \"kubernetes.io/projected/bf8216a7-a91a-4b96-88f8-f12c836ba326-kube-api-access-4vxhm\") pod \"bf8216a7-a91a-4b96-88f8-f12c836ba326\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.437420 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf8216a7-a91a-4b96-88f8-f12c836ba326-logs\") pod \"bf8216a7-a91a-4b96-88f8-f12c836ba326\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.437456 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-config-data\") pod \"bf8216a7-a91a-4b96-88f8-f12c836ba326\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.437491 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-combined-ca-bundle\") pod \"bf8216a7-a91a-4b96-88f8-f12c836ba326\" (UID: \"bf8216a7-a91a-4b96-88f8-f12c836ba326\") " Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.438471 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bf8216a7-a91a-4b96-88f8-f12c836ba326-logs" (OuterVolumeSpecName: "logs") pod "bf8216a7-a91a-4b96-88f8-f12c836ba326" (UID: "bf8216a7-a91a-4b96-88f8-f12c836ba326"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.442480 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf8216a7-a91a-4b96-88f8-f12c836ba326-kube-api-access-4vxhm" (OuterVolumeSpecName: "kube-api-access-4vxhm") pod "bf8216a7-a91a-4b96-88f8-f12c836ba326" (UID: "bf8216a7-a91a-4b96-88f8-f12c836ba326"). InnerVolumeSpecName "kube-api-access-4vxhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.462990 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf8216a7-a91a-4b96-88f8-f12c836ba326" (UID: "bf8216a7-a91a-4b96-88f8-f12c836ba326"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.467116 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-config-data" (OuterVolumeSpecName: "config-data") pod "bf8216a7-a91a-4b96-88f8-f12c836ba326" (UID: "bf8216a7-a91a-4b96-88f8-f12c836ba326"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.502426 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bf8216a7-a91a-4b96-88f8-f12c836ba326" (UID: "bf8216a7-a91a-4b96-88f8-f12c836ba326"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.539781 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf8216a7-a91a-4b96-88f8-f12c836ba326-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.539816 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.539829 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.539841 4991 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf8216a7-a91a-4b96-88f8-f12c836ba326-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.539855 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vxhm\" (UniqueName: \"kubernetes.io/projected/bf8216a7-a91a-4b96-88f8-f12c836ba326-kube-api-access-4vxhm\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.549560 4991 generic.go:334] "Generic (PLEG): container finished" podID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerID="62e47d84171c7222528d366552dba6e76ba86c5bd424f4e5ce7c51dc4772d323" exitCode=143 Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.549639 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c71fc75d-0a11-4673-a14d-90f3269ff26f","Type":"ContainerDied","Data":"62e47d84171c7222528d366552dba6e76ba86c5bd424f4e5ce7c51dc4772d323"} 
Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.551703 4991 generic.go:334] "Generic (PLEG): container finished" podID="bf8216a7-a91a-4b96-88f8-f12c836ba326" containerID="01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991" exitCode=0 Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.551724 4991 generic.go:334] "Generic (PLEG): container finished" podID="bf8216a7-a91a-4b96-88f8-f12c836ba326" containerID="15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636" exitCode=143 Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.551761 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.551768 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8216a7-a91a-4b96-88f8-f12c836ba326","Type":"ContainerDied","Data":"01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991"} Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.551791 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8216a7-a91a-4b96-88f8-f12c836ba326","Type":"ContainerDied","Data":"15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636"} Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.551803 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf8216a7-a91a-4b96-88f8-f12c836ba326","Type":"ContainerDied","Data":"2188e0335cc18f6e064608a19f348a65e0862ec580dc419a95d9a758739cc25a"} Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.551819 4991 scope.go:117] "RemoveContainer" containerID="01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.558658 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-smsnb" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.560424 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-smsnb" event={"ID":"ea12773e-89f6-478d-92f5-23bfb4a05a6a","Type":"ContainerDied","Data":"caad4f004c8746783d23cbdd0590a1bd7e73f901af22b44ecfbbfbfa9512a58d"} Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.567526 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caad4f004c8746783d23cbdd0590a1bd7e73f901af22b44ecfbbfbfa9512a58d" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.606621 4991 scope.go:117] "RemoveContainer" containerID="15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.610051 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.626540 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.636360 4991 scope.go:117] "RemoveContainer" containerID="01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991" Oct 06 08:41:03 crc kubenswrapper[4991]: E1006 08:41:03.638096 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991\": container with ID starting with 01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991 not found: ID does not exist" containerID="01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.638149 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991"} err="failed to get container 
status \"01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991\": rpc error: code = NotFound desc = could not find container \"01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991\": container with ID starting with 01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991 not found: ID does not exist" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.638180 4991 scope.go:117] "RemoveContainer" containerID="15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636" Oct 06 08:41:03 crc kubenswrapper[4991]: E1006 08:41:03.639533 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636\": container with ID starting with 15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636 not found: ID does not exist" containerID="15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.639572 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636"} err="failed to get container status \"15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636\": rpc error: code = NotFound desc = could not find container \"15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636\": container with ID starting with 15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636 not found: ID does not exist" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.639597 4991 scope.go:117] "RemoveContainer" containerID="01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.639614 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 08:41:03 crc kubenswrapper[4991]: E1006 08:41:03.640073 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ec36c4e8-0d7b-4570-bb22-16367889063f" containerName="nova-manage" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640090 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec36c4e8-0d7b-4570-bb22-16367889063f" containerName="nova-manage" Oct 06 08:41:03 crc kubenswrapper[4991]: E1006 08:41:03.640117 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8216a7-a91a-4b96-88f8-f12c836ba326" containerName="nova-metadata-metadata" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640126 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8216a7-a91a-4b96-88f8-f12c836ba326" containerName="nova-metadata-metadata" Oct 06 08:41:03 crc kubenswrapper[4991]: E1006 08:41:03.640145 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f68ff49-be6e-460f-91e6-ec7d260e0aff" containerName="dnsmasq-dns" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640155 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f68ff49-be6e-460f-91e6-ec7d260e0aff" containerName="dnsmasq-dns" Oct 06 08:41:03 crc kubenswrapper[4991]: E1006 08:41:03.640173 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea12773e-89f6-478d-92f5-23bfb4a05a6a" containerName="nova-cell1-conductor-db-sync" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640182 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea12773e-89f6-478d-92f5-23bfb4a05a6a" containerName="nova-cell1-conductor-db-sync" Oct 06 08:41:03 crc kubenswrapper[4991]: E1006 08:41:03.640201 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f68ff49-be6e-460f-91e6-ec7d260e0aff" containerName="init" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640209 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f68ff49-be6e-460f-91e6-ec7d260e0aff" containerName="init" Oct 06 08:41:03 crc kubenswrapper[4991]: E1006 08:41:03.640226 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bf8216a7-a91a-4b96-88f8-f12c836ba326" containerName="nova-metadata-log" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640235 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8216a7-a91a-4b96-88f8-f12c836ba326" containerName="nova-metadata-log" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640501 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea12773e-89f6-478d-92f5-23bfb4a05a6a" containerName="nova-cell1-conductor-db-sync" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640527 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f68ff49-be6e-460f-91e6-ec7d260e0aff" containerName="dnsmasq-dns" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640536 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf8216a7-a91a-4b96-88f8-f12c836ba326" containerName="nova-metadata-metadata" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640574 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf8216a7-a91a-4b96-88f8-f12c836ba326" containerName="nova-metadata-log" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.640592 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec36c4e8-0d7b-4570-bb22-16367889063f" containerName="nova-manage" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.641391 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.642869 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991"} err="failed to get container status \"01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991\": rpc error: code = NotFound desc = could not find container \"01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991\": container with ID starting with 01d762e61ae9d289451651eb7beaf36d0d1bab5004cdfffa1fda9b970fe9d991 not found: ID does not exist" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.642910 4991 scope.go:117] "RemoveContainer" containerID="15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.643248 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636"} err="failed to get container status \"15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636\": rpc error: code = NotFound desc = could not find container \"15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636\": container with ID starting with 15b4dd019e63d7e16c0364826bfb109978eda4434c2f829c1de09cb7150af636 not found: ID does not exist" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.647346 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.650952 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.655166 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.661428 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.661822 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.662151 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.672725 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.744806 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.744859 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2vz5\" (UniqueName: \"kubernetes.io/projected/704c9be7-2e65-4018-9388-7f75e8f4dcd6-kube-api-access-t2vz5\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.744972 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-config-data\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.745039 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.745134 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.745245 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/704c9be7-2e65-4018-9388-7f75e8f4dcd6-logs\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.745342 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfnrn\" (UniqueName: \"kubernetes.io/projected/c5b53689-326b-4f4c-a625-beec7a3631fa-kube-api-access-xfnrn\") pod \"nova-cell1-conductor-0\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.745427 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.846522 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-config-data\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.846562 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.846599 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.846636 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/704c9be7-2e65-4018-9388-7f75e8f4dcd6-logs\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.846673 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfnrn\" (UniqueName: \"kubernetes.io/projected/c5b53689-326b-4f4c-a625-beec7a3631fa-kube-api-access-xfnrn\") pod \"nova-cell1-conductor-0\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.846710 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"c5b53689-326b-4f4c-a625-beec7a3631fa\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.846758 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.846776 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2vz5\" (UniqueName: \"kubernetes.io/projected/704c9be7-2e65-4018-9388-7f75e8f4dcd6-kube-api-access-t2vz5\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.847790 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/704c9be7-2e65-4018-9388-7f75e8f4dcd6-logs\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.850210 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-config-data\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.851210 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.851533 4991 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.855086 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.855816 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.865835 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfnrn\" (UniqueName: \"kubernetes.io/projected/c5b53689-326b-4f4c-a625-beec7a3631fa-kube-api-access-xfnrn\") pod \"nova-cell1-conductor-0\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.867616 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2vz5\" (UniqueName: \"kubernetes.io/projected/704c9be7-2e65-4018-9388-7f75e8f4dcd6-kube-api-access-t2vz5\") pod \"nova-metadata-0\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " pod="openstack/nova-metadata-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.975065 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:03 crc kubenswrapper[4991]: I1006 08:41:03.997885 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:04 crc kubenswrapper[4991]: I1006 08:41:04.460861 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 08:41:04 crc kubenswrapper[4991]: I1006 08:41:04.477261 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:04 crc kubenswrapper[4991]: I1006 08:41:04.572362 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"704c9be7-2e65-4018-9388-7f75e8f4dcd6","Type":"ContainerStarted","Data":"53782fc04dbb7a90a25c9515e5478fb1738c707fc10f55c2b7c8aef803e82a1c"} Oct 06 08:41:04 crc kubenswrapper[4991]: I1006 08:41:04.574457 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c5b53689-326b-4f4c-a625-beec7a3631fa","Type":"ContainerStarted","Data":"be2c98bbc96939e542bf23df0e5cf44de8fa28b270045204096113d0a382a945"} Oct 06 08:41:04 crc kubenswrapper[4991]: I1006 08:41:04.575847 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="50e0cec2-978e-4758-9881-c2ae6db0b8c5" containerName="nova-scheduler-scheduler" containerID="cri-o://ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b" gracePeriod=30 Oct 06 08:41:05 crc kubenswrapper[4991]: I1006 08:41:05.255724 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf8216a7-a91a-4b96-88f8-f12c836ba326" path="/var/lib/kubelet/pods/bf8216a7-a91a-4b96-88f8-f12c836ba326/volumes" Oct 06 08:41:05 crc kubenswrapper[4991]: I1006 08:41:05.341121 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 08:41:05 crc kubenswrapper[4991]: I1006 08:41:05.604136 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c5b53689-326b-4f4c-a625-beec7a3631fa","Type":"ContainerStarted","Data":"05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693"} Oct 06 08:41:05 crc kubenswrapper[4991]: I1006 08:41:05.604756 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:05 crc kubenswrapper[4991]: I1006 08:41:05.613485 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"704c9be7-2e65-4018-9388-7f75e8f4dcd6","Type":"ContainerStarted","Data":"8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827"} Oct 06 08:41:05 crc kubenswrapper[4991]: I1006 08:41:05.613546 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"704c9be7-2e65-4018-9388-7f75e8f4dcd6","Type":"ContainerStarted","Data":"180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23"} Oct 06 08:41:05 crc kubenswrapper[4991]: I1006 08:41:05.633346 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.63332606 podStartE2EDuration="2.63332606s" podCreationTimestamp="2025-10-06 08:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:05.621092227 +0000 UTC m=+1317.358842268" watchObservedRunningTime="2025-10-06 08:41:05.63332606 +0000 UTC m=+1317.371076081" Oct 06 08:41:05 crc kubenswrapper[4991]: I1006 08:41:05.652387 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.652368644 podStartE2EDuration="2.652368644s" podCreationTimestamp="2025-10-06 08:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-06 08:41:05.645206813 +0000 UTC m=+1317.382956844" watchObservedRunningTime="2025-10-06 08:41:05.652368644 +0000 UTC m=+1317.390118665" Oct 06 08:41:05 crc kubenswrapper[4991]: I1006 08:41:05.989586 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.100438 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnfw5\" (UniqueName: \"kubernetes.io/projected/50e0cec2-978e-4758-9881-c2ae6db0b8c5-kube-api-access-rnfw5\") pod \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.100483 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-config-data\") pod \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.100653 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-combined-ca-bundle\") pod \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\" (UID: \"50e0cec2-978e-4758-9881-c2ae6db0b8c5\") " Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.111934 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e0cec2-978e-4758-9881-c2ae6db0b8c5-kube-api-access-rnfw5" (OuterVolumeSpecName: "kube-api-access-rnfw5") pod "50e0cec2-978e-4758-9881-c2ae6db0b8c5" (UID: "50e0cec2-978e-4758-9881-c2ae6db0b8c5"). InnerVolumeSpecName "kube-api-access-rnfw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.130926 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-config-data" (OuterVolumeSpecName: "config-data") pod "50e0cec2-978e-4758-9881-c2ae6db0b8c5" (UID: "50e0cec2-978e-4758-9881-c2ae6db0b8c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.142754 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50e0cec2-978e-4758-9881-c2ae6db0b8c5" (UID: "50e0cec2-978e-4758-9881-c2ae6db0b8c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.202391 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.202438 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnfw5\" (UniqueName: \"kubernetes.io/projected/50e0cec2-978e-4758-9881-c2ae6db0b8c5-kube-api-access-rnfw5\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.202450 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50e0cec2-978e-4758-9881-c2ae6db0b8c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.634834 4991 generic.go:334] "Generic (PLEG): container finished" podID="50e0cec2-978e-4758-9881-c2ae6db0b8c5" containerID="ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b" 
exitCode=0 Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.635063 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50e0cec2-978e-4758-9881-c2ae6db0b8c5","Type":"ContainerDied","Data":"ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b"} Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.636535 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50e0cec2-978e-4758-9881-c2ae6db0b8c5","Type":"ContainerDied","Data":"3b0ac81739f0ba38179dc4646c80ace753732c50d96fcba09a8014c4c088d9a1"} Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.636573 4991 scope.go:117] "RemoveContainer" containerID="ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.635207 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.674946 4991 scope.go:117] "RemoveContainer" containerID="ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b" Oct 06 08:41:06 crc kubenswrapper[4991]: E1006 08:41:06.675838 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b\": container with ID starting with ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b not found: ID does not exist" containerID="ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.675878 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b"} err="failed to get container status \"ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b\": rpc error: code = NotFound desc = could not find 
container \"ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b\": container with ID starting with ee93c47bcd58193c04fb168243ffbdd6ce203c5d5d79710c12bb8f9a9937583b not found: ID does not exist" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.701049 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.712942 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.719907 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:06 crc kubenswrapper[4991]: E1006 08:41:06.720398 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e0cec2-978e-4758-9881-c2ae6db0b8c5" containerName="nova-scheduler-scheduler" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.720420 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e0cec2-978e-4758-9881-c2ae6db0b8c5" containerName="nova-scheduler-scheduler" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.720611 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e0cec2-978e-4758-9881-c2ae6db0b8c5" containerName="nova-scheduler-scheduler" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.721228 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.727441 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.737698 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.813271 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r67b\" (UniqueName: \"kubernetes.io/projected/f59c5f21-de13-43af-94e4-a2fc82169e33-kube-api-access-6r67b\") pod \"nova-scheduler-0\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.813379 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-config-data\") pod \"nova-scheduler-0\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.813828 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.915601 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-config-data\") pod \"nova-scheduler-0\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.915709 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.915778 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r67b\" (UniqueName: \"kubernetes.io/projected/f59c5f21-de13-43af-94e4-a2fc82169e33-kube-api-access-6r67b\") pod \"nova-scheduler-0\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.919376 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.919642 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-config-data\") pod \"nova-scheduler-0\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:06 crc kubenswrapper[4991]: I1006 08:41:06.933011 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r67b\" (UniqueName: \"kubernetes.io/projected/f59c5f21-de13-43af-94e4-a2fc82169e33-kube-api-access-6r67b\") pod \"nova-scheduler-0\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:07 crc kubenswrapper[4991]: I1006 08:41:07.039016 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:07 crc kubenswrapper[4991]: I1006 08:41:07.259530 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e0cec2-978e-4758-9881-c2ae6db0b8c5" path="/var/lib/kubelet/pods/50e0cec2-978e-4758-9881-c2ae6db0b8c5/volumes" Oct 06 08:41:07 crc kubenswrapper[4991]: I1006 08:41:07.509223 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:07 crc kubenswrapper[4991]: W1006 08:41:07.516474 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf59c5f21_de13_43af_94e4_a2fc82169e33.slice/crio-14eb8b2dda77203f79419551ebed433d2eea4ed5f0974ae306dad0fc65b819d9 WatchSource:0}: Error finding container 14eb8b2dda77203f79419551ebed433d2eea4ed5f0974ae306dad0fc65b819d9: Status 404 returned error can't find the container with id 14eb8b2dda77203f79419551ebed433d2eea4ed5f0974ae306dad0fc65b819d9 Oct 06 08:41:07 crc kubenswrapper[4991]: I1006 08:41:07.650995 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f59c5f21-de13-43af-94e4-a2fc82169e33","Type":"ContainerStarted","Data":"14eb8b2dda77203f79419551ebed433d2eea4ed5f0974ae306dad0fc65b819d9"} Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.668144 4991 generic.go:334] "Generic (PLEG): container finished" podID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerID="5b868b9187832c2005be815975b42773e519e81fb95f561cfb8a51e94477be13" exitCode=0 Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.668235 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c71fc75d-0a11-4673-a14d-90f3269ff26f","Type":"ContainerDied","Data":"5b868b9187832c2005be815975b42773e519e81fb95f561cfb8a51e94477be13"} Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.668597 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c71fc75d-0a11-4673-a14d-90f3269ff26f","Type":"ContainerDied","Data":"cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597"} Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.668616 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.671658 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f59c5f21-de13-43af-94e4-a2fc82169e33","Type":"ContainerStarted","Data":"5f56329edbf132723d4798cb7d65ed87d6644c63faa775d835afbc6a4ee41572"} Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.679078 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.698397 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6983831179999997 podStartE2EDuration="2.698383118s" podCreationTimestamp="2025-10-06 08:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:08.692615606 +0000 UTC m=+1320.430365627" watchObservedRunningTime="2025-10-06 08:41:08.698383118 +0000 UTC m=+1320.436133129" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.753059 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71fc75d-0a11-4673-a14d-90f3269ff26f-logs\") pod \"c71fc75d-0a11-4673-a14d-90f3269ff26f\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.753916 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzt6d\" (UniqueName: 
\"kubernetes.io/projected/c71fc75d-0a11-4673-a14d-90f3269ff26f-kube-api-access-kzt6d\") pod \"c71fc75d-0a11-4673-a14d-90f3269ff26f\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.753633 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c71fc75d-0a11-4673-a14d-90f3269ff26f-logs" (OuterVolumeSpecName: "logs") pod "c71fc75d-0a11-4673-a14d-90f3269ff26f" (UID: "c71fc75d-0a11-4673-a14d-90f3269ff26f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.754146 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-combined-ca-bundle\") pod \"c71fc75d-0a11-4673-a14d-90f3269ff26f\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.754254 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-config-data\") pod \"c71fc75d-0a11-4673-a14d-90f3269ff26f\" (UID: \"c71fc75d-0a11-4673-a14d-90f3269ff26f\") " Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.761107 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c71fc75d-0a11-4673-a14d-90f3269ff26f-kube-api-access-kzt6d" (OuterVolumeSpecName: "kube-api-access-kzt6d") pod "c71fc75d-0a11-4673-a14d-90f3269ff26f" (UID: "c71fc75d-0a11-4673-a14d-90f3269ff26f"). InnerVolumeSpecName "kube-api-access-kzt6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.762780 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzt6d\" (UniqueName: \"kubernetes.io/projected/c71fc75d-0a11-4673-a14d-90f3269ff26f-kube-api-access-kzt6d\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.762801 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c71fc75d-0a11-4673-a14d-90f3269ff26f-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.792110 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-config-data" (OuterVolumeSpecName: "config-data") pod "c71fc75d-0a11-4673-a14d-90f3269ff26f" (UID: "c71fc75d-0a11-4673-a14d-90f3269ff26f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.792491 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c71fc75d-0a11-4673-a14d-90f3269ff26f" (UID: "c71fc75d-0a11-4673-a14d-90f3269ff26f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.864924 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.864964 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c71fc75d-0a11-4673-a14d-90f3269ff26f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.999051 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 08:41:08 crc kubenswrapper[4991]: I1006 08:41:08.999099 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.097642 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.097840 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="849ecb33-22bf-43e5-ac4c-9a3e5cc0c668" containerName="kube-state-metrics" containerID="cri-o://280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca" gracePeriod=30 Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.617223 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.680836 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5gkk\" (UniqueName: \"kubernetes.io/projected/849ecb33-22bf-43e5-ac4c-9a3e5cc0c668-kube-api-access-p5gkk\") pod \"849ecb33-22bf-43e5-ac4c-9a3e5cc0c668\" (UID: \"849ecb33-22bf-43e5-ac4c-9a3e5cc0c668\") " Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.687128 4991 generic.go:334] "Generic (PLEG): container finished" podID="849ecb33-22bf-43e5-ac4c-9a3e5cc0c668" containerID="280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca" exitCode=2 Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.687542 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.688160 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"849ecb33-22bf-43e5-ac4c-9a3e5cc0c668","Type":"ContainerDied","Data":"280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca"} Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.688197 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"849ecb33-22bf-43e5-ac4c-9a3e5cc0c668","Type":"ContainerDied","Data":"c6ba945d618e0b9e5bedd388788e1f9ef676b3fab13c3fa86a94ca5c6129ab4e"} Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.688217 4991 scope.go:117] "RemoveContainer" containerID="280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.688379 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.690440 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849ecb33-22bf-43e5-ac4c-9a3e5cc0c668-kube-api-access-p5gkk" (OuterVolumeSpecName: "kube-api-access-p5gkk") pod "849ecb33-22bf-43e5-ac4c-9a3e5cc0c668" (UID: "849ecb33-22bf-43e5-ac4c-9a3e5cc0c668"). InnerVolumeSpecName "kube-api-access-p5gkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.752212 4991 scope.go:117] "RemoveContainer" containerID="280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca" Oct 06 08:41:09 crc kubenswrapper[4991]: E1006 08:41:09.756030 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca\": container with ID starting with 280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca not found: ID does not exist" containerID="280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.756108 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca"} err="failed to get container status \"280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca\": rpc error: code = NotFound desc = could not find container \"280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca\": container with ID starting with 280a23bf66262369a6dfbd87dd805cb72c701cac99d5cb0d159a33cb2b3a83ca not found: ID does not exist" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.765704 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.774482 4991 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.781977 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:09 crc kubenswrapper[4991]: E1006 08:41:09.782357 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849ecb33-22bf-43e5-ac4c-9a3e5cc0c668" containerName="kube-state-metrics" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.782374 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="849ecb33-22bf-43e5-ac4c-9a3e5cc0c668" containerName="kube-state-metrics" Oct 06 08:41:09 crc kubenswrapper[4991]: E1006 08:41:09.782398 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerName="nova-api-log" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.782405 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerName="nova-api-log" Oct 06 08:41:09 crc kubenswrapper[4991]: E1006 08:41:09.782436 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerName="nova-api-api" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.782442 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerName="nova-api-api" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.782620 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="849ecb33-22bf-43e5-ac4c-9a3e5cc0c668" containerName="kube-state-metrics" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.782639 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerName="nova-api-log" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.782657 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" containerName="nova-api-api" Oct 06 
08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.783519 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5gkk\" (UniqueName: \"kubernetes.io/projected/849ecb33-22bf-43e5-ac4c-9a3e5cc0c668-kube-api-access-p5gkk\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.783555 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.786616 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.798629 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.885599 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.885696 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d242e246-427c-43d9-992e-9175bb2ac3d9-logs\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.885739 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq489\" (UniqueName: \"kubernetes.io/projected/d242e246-427c-43d9-992e-9175bb2ac3d9-kube-api-access-wq489\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.885800 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-config-data\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.987936 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.988012 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d242e246-427c-43d9-992e-9175bb2ac3d9-logs\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.988041 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq489\" (UniqueName: \"kubernetes.io/projected/d242e246-427c-43d9-992e-9175bb2ac3d9-kube-api-access-wq489\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.988092 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-config-data\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:09 crc kubenswrapper[4991]: I1006 08:41:09.989853 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d242e246-427c-43d9-992e-9175bb2ac3d9-logs\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 
06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.000992 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.001122 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-config-data\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.007549 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq489\" (UniqueName: \"kubernetes.io/projected/d242e246-427c-43d9-992e-9175bb2ac3d9-kube-api-access-wq489\") pod \"nova-api-0\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " pod="openstack/nova-api-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.112816 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.141108 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.162908 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.172426 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.173943 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.176702 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.178439 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.185271 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.294019 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.294097 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.294124 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.294153 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vg9r\" (UniqueName: 
\"kubernetes.io/projected/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-api-access-4vg9r\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.398711 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.398887 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.399082 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.399143 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vg9r\" (UniqueName: \"kubernetes.io/projected/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-api-access-4vg9r\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.405390 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.405490 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.405824 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.432338 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vg9r\" (UniqueName: \"kubernetes.io/projected/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-api-access-4vg9r\") pod \"kube-state-metrics-0\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.550401 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.573061 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.697306 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d242e246-427c-43d9-992e-9175bb2ac3d9","Type":"ContainerStarted","Data":"4c00256b5a84f9818798f21399fba391c32511c711669cf6b391121e3d593d18"} Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.958735 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.959466 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="ceilometer-central-agent" containerID="cri-o://76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4" gracePeriod=30 Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.959539 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="sg-core" containerID="cri-o://a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9" gracePeriod=30 Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.959495 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="proxy-httpd" containerID="cri-o://d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d" gracePeriod=30 Oct 06 08:41:10 crc kubenswrapper[4991]: I1006 08:41:10.959544 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="ceilometer-notification-agent" 
containerID="cri-o://20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208" gracePeriod=30 Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.012071 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.262224 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849ecb33-22bf-43e5-ac4c-9a3e5cc0c668" path="/var/lib/kubelet/pods/849ecb33-22bf-43e5-ac4c-9a3e5cc0c668/volumes" Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.262999 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c71fc75d-0a11-4673-a14d-90f3269ff26f" path="/var/lib/kubelet/pods/c71fc75d-0a11-4673-a14d-90f3269ff26f/volumes" Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.716219 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fded3e15-f946-4f86-bed4-2c4a3262395a","Type":"ContainerStarted","Data":"2b0d0844a91b8badda9de344d7ec23d9d43da43f008821a4ea8b7ae982ffc991"} Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.717438 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fded3e15-f946-4f86-bed4-2c4a3262395a","Type":"ContainerStarted","Data":"dedb648723c43b1425e3d19a451a68998cd390e07ca52163680413cc97987dad"} Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.718430 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.722107 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d242e246-427c-43d9-992e-9175bb2ac3d9","Type":"ContainerStarted","Data":"4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2"} Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.722218 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d242e246-427c-43d9-992e-9175bb2ac3d9","Type":"ContainerStarted","Data":"0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9"} Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.727735 4991 generic.go:334] "Generic (PLEG): container finished" podID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerID="d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d" exitCode=0 Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.727827 4991 generic.go:334] "Generic (PLEG): container finished" podID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerID="a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9" exitCode=2 Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.727884 4991 generic.go:334] "Generic (PLEG): container finished" podID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerID="76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4" exitCode=0 Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.727964 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c","Type":"ContainerDied","Data":"d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d"} Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.728036 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c","Type":"ContainerDied","Data":"a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9"} Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.728096 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c","Type":"ContainerDied","Data":"76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4"} Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.753912 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" 
podStartSLOduration=1.378255614 podStartE2EDuration="1.753889768s" podCreationTimestamp="2025-10-06 08:41:10 +0000 UTC" firstStartedPulling="2025-10-06 08:41:11.025349907 +0000 UTC m=+1322.763099928" lastFinishedPulling="2025-10-06 08:41:11.400984061 +0000 UTC m=+1323.138734082" observedRunningTime="2025-10-06 08:41:11.747128378 +0000 UTC m=+1323.484878439" watchObservedRunningTime="2025-10-06 08:41:11.753889768 +0000 UTC m=+1323.491639809" Oct 06 08:41:11 crc kubenswrapper[4991]: I1006 08:41:11.773244 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.773220478 podStartE2EDuration="2.773220478s" podCreationTimestamp="2025-10-06 08:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:11.769073913 +0000 UTC m=+1323.506823934" watchObservedRunningTime="2025-10-06 08:41:11.773220478 +0000 UTC m=+1323.510970509" Oct 06 08:41:12 crc kubenswrapper[4991]: I1006 08:41:12.039984 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.454837 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.563841 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-log-httpd\") pod \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.563955 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-combined-ca-bundle\") pod \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.564133 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zx4t\" (UniqueName: \"kubernetes.io/projected/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-kube-api-access-5zx4t\") pod \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.564489 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" (UID: "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.564895 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-config-data\") pod \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.564953 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-run-httpd\") pod \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.564995 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-sg-core-conf-yaml\") pod \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.565058 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-scripts\") pod \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\" (UID: \"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c\") " Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.565283 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" (UID: "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.565792 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.565823 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.580874 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-kube-api-access-5zx4t" (OuterVolumeSpecName: "kube-api-access-5zx4t") pod "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" (UID: "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c"). InnerVolumeSpecName "kube-api-access-5zx4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.582317 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-scripts" (OuterVolumeSpecName: "scripts") pod "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" (UID: "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.598831 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" (UID: "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.654419 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" (UID: "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.667060 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.667093 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zx4t\" (UniqueName: \"kubernetes.io/projected/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-kube-api-access-5zx4t\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.667109 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.667122 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.668069 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-config-data" (OuterVolumeSpecName: "config-data") pod "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" (UID: "8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.752424 4991 generic.go:334] "Generic (PLEG): container finished" podID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerID="20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208" exitCode=0 Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.752542 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.752589 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c","Type":"ContainerDied","Data":"20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208"} Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.752629 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c","Type":"ContainerDied","Data":"163a665e549e12f2fe3cff3b22897caf0563180273e0b9cbdd9676b07530f94b"} Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.752657 4991 scope.go:117] "RemoveContainer" containerID="d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.770207 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.786950 4991 scope.go:117] "RemoveContainer" containerID="a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.795226 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.815728 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 
08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.819582 4991 scope.go:117] "RemoveContainer" containerID="20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.830225 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:13 crc kubenswrapper[4991]: E1006 08:41:13.830687 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="ceilometer-central-agent" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.830715 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="ceilometer-central-agent" Oct 06 08:41:13 crc kubenswrapper[4991]: E1006 08:41:13.830733 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="proxy-httpd" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.830742 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="proxy-httpd" Oct 06 08:41:13 crc kubenswrapper[4991]: E1006 08:41:13.830775 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="sg-core" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.830784 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="sg-core" Oct 06 08:41:13 crc kubenswrapper[4991]: E1006 08:41:13.830817 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="ceilometer-notification-agent" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.830824 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="ceilometer-notification-agent" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.831028 4991 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="sg-core" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.831048 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="ceilometer-notification-agent" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.831068 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="proxy-httpd" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.831082 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" containerName="ceilometer-central-agent" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.833159 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.835280 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.835348 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.835658 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.846279 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.858031 4991 scope.go:117] "RemoveContainer" containerID="76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.888964 4991 scope.go:117] "RemoveContainer" containerID="d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d" Oct 06 08:41:13 crc kubenswrapper[4991]: E1006 08:41:13.889590 4991 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d\": container with ID starting with d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d not found: ID does not exist" containerID="d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.889621 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d"} err="failed to get container status \"d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d\": rpc error: code = NotFound desc = could not find container \"d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d\": container with ID starting with d91a6a3f97c85227c6c8158d12bc1d252eec5afc6e9e6b1002a81633b947795d not found: ID does not exist" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.889642 4991 scope.go:117] "RemoveContainer" containerID="a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9" Oct 06 08:41:13 crc kubenswrapper[4991]: E1006 08:41:13.889913 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9\": container with ID starting with a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9 not found: ID does not exist" containerID="a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.889938 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9"} err="failed to get container status \"a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9\": rpc error: code = NotFound 
desc = could not find container \"a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9\": container with ID starting with a49a3c5d2e9373ef620d7e74352d2a2894010f4d9feb42d528fe9066944d04e9 not found: ID does not exist" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.889950 4991 scope.go:117] "RemoveContainer" containerID="20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208" Oct 06 08:41:13 crc kubenswrapper[4991]: E1006 08:41:13.890181 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208\": container with ID starting with 20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208 not found: ID does not exist" containerID="20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.890209 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208"} err="failed to get container status \"20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208\": rpc error: code = NotFound desc = could not find container \"20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208\": container with ID starting with 20650d535a4df593111176e35f4c5a92df8e886abef0f1f7ceaf09bdf9e42208 not found: ID does not exist" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.890227 4991 scope.go:117] "RemoveContainer" containerID="76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4" Oct 06 08:41:13 crc kubenswrapper[4991]: E1006 08:41:13.890493 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4\": container with ID starting with 
76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4 not found: ID does not exist" containerID="76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.890518 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4"} err="failed to get container status \"76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4\": rpc error: code = NotFound desc = could not find container \"76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4\": container with ID starting with 76dfef401531c47982e5d514537f3a61d539f4fff3d57210c92acb4015890be4 not found: ID does not exist" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.973907 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-run-httpd\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.974076 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-log-httpd\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.974124 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-scripts\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.974200 4991 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-config-data\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.974271 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.974337 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.974421 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.974558 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgxg7\" (UniqueName: \"kubernetes.io/projected/c0414db4-b3f4-4864-87a1-b84ab999cf9b-kube-api-access-rgxg7\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:13 crc kubenswrapper[4991]: I1006 08:41:13.999490 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 08:41:13 crc 
kubenswrapper[4991]: I1006 08:41:13.999533 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.001878 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.076449 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-run-httpd\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.077074 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-log-httpd\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.077118 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-scripts\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.077154 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-config-data\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.077244 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.077273 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.077313 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.077365 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgxg7\" (UniqueName: \"kubernetes.io/projected/c0414db4-b3f4-4864-87a1-b84ab999cf9b-kube-api-access-rgxg7\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.076916 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-run-httpd\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.078986 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-log-httpd\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.083936 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-scripts\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.085756 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.086078 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-config-data\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.089904 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.090208 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.100669 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgxg7\" (UniqueName: \"kubernetes.io/projected/c0414db4-b3f4-4864-87a1-b84ab999cf9b-kube-api-access-rgxg7\") pod \"ceilometer-0\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") " pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 
08:41:14.164097 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.663328 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:14 crc kubenswrapper[4991]: I1006 08:41:14.762204 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0414db4-b3f4-4864-87a1-b84ab999cf9b","Type":"ContainerStarted","Data":"90f1d13bcc0e0c9c30c21806566067aad6bd00112e618f9aa74fc2f3c3bfe650"} Oct 06 08:41:15 crc kubenswrapper[4991]: I1006 08:41:15.011604 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 08:41:15 crc kubenswrapper[4991]: I1006 08:41:15.011613 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 08:41:15 crc kubenswrapper[4991]: I1006 08:41:15.259200 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c" path="/var/lib/kubelet/pods/8734a1c0-e8fd-46bc-90d4-6a7edcce1e2c/volumes" Oct 06 08:41:15 crc kubenswrapper[4991]: I1006 08:41:15.772882 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0414db4-b3f4-4864-87a1-b84ab999cf9b","Type":"ContainerStarted","Data":"821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b"} Oct 06 08:41:16 crc kubenswrapper[4991]: I1006 08:41:16.787243 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c0414db4-b3f4-4864-87a1-b84ab999cf9b","Type":"ContainerStarted","Data":"c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888"} Oct 06 08:41:17 crc kubenswrapper[4991]: I1006 08:41:17.039497 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 08:41:17 crc kubenswrapper[4991]: I1006 08:41:17.073822 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 08:41:17 crc kubenswrapper[4991]: I1006 08:41:17.803355 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0414db4-b3f4-4864-87a1-b84ab999cf9b","Type":"ContainerStarted","Data":"f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d"} Oct 06 08:41:17 crc kubenswrapper[4991]: I1006 08:41:17.840728 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 08:41:18 crc kubenswrapper[4991]: I1006 08:41:18.813673 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0414db4-b3f4-4864-87a1-b84ab999cf9b","Type":"ContainerStarted","Data":"5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c"} Oct 06 08:41:18 crc kubenswrapper[4991]: I1006 08:41:18.814210 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:41:18 crc kubenswrapper[4991]: I1006 08:41:18.837348 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.303892009 podStartE2EDuration="5.837326526s" podCreationTimestamp="2025-10-06 08:41:13 +0000 UTC" firstStartedPulling="2025-10-06 08:41:14.663228787 +0000 UTC m=+1326.400978808" lastFinishedPulling="2025-10-06 08:41:18.196663294 +0000 UTC m=+1329.934413325" observedRunningTime="2025-10-06 08:41:18.835446993 +0000 UTC m=+1330.573197034" 
watchObservedRunningTime="2025-10-06 08:41:18.837326526 +0000 UTC m=+1330.575076557" Oct 06 08:41:20 crc kubenswrapper[4991]: I1006 08:41:20.114382 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 08:41:20 crc kubenswrapper[4991]: I1006 08:41:20.114475 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 08:41:20 crc kubenswrapper[4991]: I1006 08:41:20.558980 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 08:41:21 crc kubenswrapper[4991]: I1006 08:41:21.198547 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 08:41:21 crc kubenswrapper[4991]: I1006 08:41:21.198608 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 08:41:24 crc kubenswrapper[4991]: I1006 08:41:24.006348 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 08:41:24 crc kubenswrapper[4991]: I1006 08:41:24.007848 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 08:41:24 crc kubenswrapper[4991]: I1006 08:41:24.013851 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 08:41:24 crc kubenswrapper[4991]: I1006 08:41:24.901715 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 
08:41:26 crc kubenswrapper[4991]: E1006 08:41:26.779566 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc71fc75d_0a11_4673_a14d_90f3269ff26f.slice/crio-cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597\": RecentStats: unable to find data in memory cache]"
Oct 06 08:41:26 crc kubenswrapper[4991]: I1006 08:41:26.913851 4991 generic.go:334] "Generic (PLEG): container finished" podID="ebecc338-537f-4b31-b992-9cc46c89ea19" containerID="afaad24da82e9977eb0954a81eb93a35ce855f528655c94b4ae6d47f4f212c3d" exitCode=137
Oct 06 08:41:26 crc kubenswrapper[4991]: I1006 08:41:26.913915 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ebecc338-537f-4b31-b992-9cc46c89ea19","Type":"ContainerDied","Data":"afaad24da82e9977eb0954a81eb93a35ce855f528655c94b4ae6d47f4f212c3d"}
Oct 06 08:41:26 crc kubenswrapper[4991]: I1006 08:41:26.913976 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ebecc338-537f-4b31-b992-9cc46c89ea19","Type":"ContainerDied","Data":"199f23db57a5cfa0b7e10ef818bfa256ec4fcb89ff9c4a0c2fb11fca82dba413"}
Oct 06 08:41:26 crc kubenswrapper[4991]: I1006 08:41:26.913996 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="199f23db57a5cfa0b7e10ef818bfa256ec4fcb89ff9c4a0c2fb11fca82dba413"
Oct 06 08:41:26 crc kubenswrapper[4991]: I1006 08:41:26.958031 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.091494 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-combined-ca-bundle\") pod \"ebecc338-537f-4b31-b992-9cc46c89ea19\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") "
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.092571 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-config-data\") pod \"ebecc338-537f-4b31-b992-9cc46c89ea19\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") "
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.092832 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh55j\" (UniqueName: \"kubernetes.io/projected/ebecc338-537f-4b31-b992-9cc46c89ea19-kube-api-access-gh55j\") pod \"ebecc338-537f-4b31-b992-9cc46c89ea19\" (UID: \"ebecc338-537f-4b31-b992-9cc46c89ea19\") "
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.103235 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebecc338-537f-4b31-b992-9cc46c89ea19-kube-api-access-gh55j" (OuterVolumeSpecName: "kube-api-access-gh55j") pod "ebecc338-537f-4b31-b992-9cc46c89ea19" (UID: "ebecc338-537f-4b31-b992-9cc46c89ea19"). InnerVolumeSpecName "kube-api-access-gh55j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.128591 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebecc338-537f-4b31-b992-9cc46c89ea19" (UID: "ebecc338-537f-4b31-b992-9cc46c89ea19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.148151 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-config-data" (OuterVolumeSpecName: "config-data") pod "ebecc338-537f-4b31-b992-9cc46c89ea19" (UID: "ebecc338-537f-4b31-b992-9cc46c89ea19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.195886 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.195917 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebecc338-537f-4b31-b992-9cc46c89ea19-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.195926 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh55j\" (UniqueName: \"kubernetes.io/projected/ebecc338-537f-4b31-b992-9cc46c89ea19-kube-api-access-gh55j\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.928813 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.962752 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.983449 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.996225 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 08:41:27 crc kubenswrapper[4991]: E1006 08:41:27.997007 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebecc338-537f-4b31-b992-9cc46c89ea19" containerName="nova-cell1-novncproxy-novncproxy"
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.997043 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebecc338-537f-4b31-b992-9cc46c89ea19" containerName="nova-cell1-novncproxy-novncproxy"
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.997416 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebecc338-537f-4b31-b992-9cc46c89ea19" containerName="nova-cell1-novncproxy-novncproxy"
Oct 06 08:41:27 crc kubenswrapper[4991]: I1006 08:41:27.998503 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.003696 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.003957 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.006172 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.007440 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.117426 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lszfm\" (UniqueName: \"kubernetes.io/projected/f4175b5d-7866-481a-a923-1ae5f3307195-kube-api-access-lszfm\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.117515 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.117698 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.117809 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.117879 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.220746 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lszfm\" (UniqueName: \"kubernetes.io/projected/f4175b5d-7866-481a-a923-1ae5f3307195-kube-api-access-lszfm\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.220853 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.220916 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.220986 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.222011 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.226355 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.227948 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.228351 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.228821 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.258683 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lszfm\" (UniqueName: \"kubernetes.io/projected/f4175b5d-7866-481a-a923-1ae5f3307195-kube-api-access-lszfm\") pod \"nova-cell1-novncproxy-0\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.326152 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.840816 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 08:41:28 crc kubenswrapper[4991]: I1006 08:41:28.945221 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f4175b5d-7866-481a-a923-1ae5f3307195","Type":"ContainerStarted","Data":"45896814893e4b4e27834ab230ff711bd7c04a9188e406652f3bd809ecd5fb5c"}
Oct 06 08:41:29 crc kubenswrapper[4991]: I1006 08:41:29.264377 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebecc338-537f-4b31-b992-9cc46c89ea19" path="/var/lib/kubelet/pods/ebecc338-537f-4b31-b992-9cc46c89ea19/volumes"
Oct 06 08:41:29 crc kubenswrapper[4991]: I1006 08:41:29.956978 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f4175b5d-7866-481a-a923-1ae5f3307195","Type":"ContainerStarted","Data":"c3b1614500005292c9e7b6920ac4a7cc87e019fd8e824585e552366b6101a5ab"}
Oct 06 08:41:29 crc kubenswrapper[4991]: I1006 08:41:29.993387 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.993366802 podStartE2EDuration="2.993366802s" podCreationTimestamp="2025-10-06 08:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:29.98327707 +0000 UTC m=+1341.721027111" watchObservedRunningTime="2025-10-06 08:41:29.993366802 +0000 UTC m=+1341.731116823"
Oct 06 08:41:30 crc kubenswrapper[4991]: I1006 08:41:30.117569 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 06 08:41:30 crc kubenswrapper[4991]: I1006 08:41:30.119769 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 06 08:41:30 crc kubenswrapper[4991]: I1006 08:41:30.120120 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 06 08:41:30 crc kubenswrapper[4991]: I1006 08:41:30.122565 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 06 08:41:30 crc kubenswrapper[4991]: I1006 08:41:30.975342 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 06 08:41:30 crc kubenswrapper[4991]: I1006 08:41:30.984032 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.171148 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"]
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.174568 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.190344 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hs5k\" (UniqueName: \"kubernetes.io/projected/2d06311c-e246-4d3d-ba9c-388cb800ac4f-kube-api-access-6hs5k\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.190396 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.190432 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.190504 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.190531 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-config\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.190585 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.194004 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"]
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.292179 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.292234 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-config\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.292328 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.292403 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hs5k\" (UniqueName: \"kubernetes.io/projected/2d06311c-e246-4d3d-ba9c-388cb800ac4f-kube-api-access-6hs5k\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.292421 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.292455 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.293238 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.303160 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.303255 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-config\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.303327 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.303680 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.317156 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hs5k\" (UniqueName: \"kubernetes.io/projected/2d06311c-e246-4d3d-ba9c-388cb800ac4f-kube-api-access-6hs5k\") pod \"dnsmasq-dns-5c7b6c5df9-6qdfr\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:31 crc kubenswrapper[4991]: I1006 08:41:31.511619 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:32 crc kubenswrapper[4991]: I1006 08:41:32.027160 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"]
Oct 06 08:41:32 crc kubenswrapper[4991]: I1006 08:41:32.915587 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 08:41:32 crc kubenswrapper[4991]: I1006 08:41:32.916312 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="ceilometer-central-agent" containerID="cri-o://821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b" gracePeriod=30
Oct 06 08:41:32 crc kubenswrapper[4991]: I1006 08:41:32.916432 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="proxy-httpd" containerID="cri-o://5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c" gracePeriod=30
Oct 06 08:41:32 crc kubenswrapper[4991]: I1006 08:41:32.916475 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="sg-core" containerID="cri-o://f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d" gracePeriod=30
Oct 06 08:41:32 crc kubenswrapper[4991]: I1006 08:41:32.916534 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="ceilometer-notification-agent" containerID="cri-o://c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888" gracePeriod=30
Oct 06 08:41:32 crc kubenswrapper[4991]: I1006 08:41:32.927540 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.194:3000/\": EOF"
Oct 06 08:41:32 crc kubenswrapper[4991]: I1006 08:41:32.999054 4991 generic.go:334] "Generic (PLEG): container finished" podID="2d06311c-e246-4d3d-ba9c-388cb800ac4f" containerID="1ffa5d61a2db9844f49f116652b16947e4d804b4b63149c5790e80ca525ec7f3" exitCode=0
Oct 06 08:41:33 crc kubenswrapper[4991]: I1006 08:41:33.000407 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr" event={"ID":"2d06311c-e246-4d3d-ba9c-388cb800ac4f","Type":"ContainerDied","Data":"1ffa5d61a2db9844f49f116652b16947e4d804b4b63149c5790e80ca525ec7f3"}
Oct 06 08:41:33 crc kubenswrapper[4991]: I1006 08:41:33.000439 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr" event={"ID":"2d06311c-e246-4d3d-ba9c-388cb800ac4f","Type":"ContainerStarted","Data":"8837db3f318960e4eaf4d8c389f5127148a69eb983ba046be127e70ea5f58f7d"}
Oct 06 08:41:33 crc kubenswrapper[4991]: I1006 08:41:33.327521 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 06 08:41:33 crc kubenswrapper[4991]: I1006 08:41:33.612891 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.025592 4991 generic.go:334] "Generic (PLEG): container finished" podID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerID="5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c" exitCode=0
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.025622 4991 generic.go:334] "Generic (PLEG): container finished" podID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerID="f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d" exitCode=2
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.025631 4991 generic.go:334] "Generic (PLEG): container finished" podID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerID="821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b" exitCode=0
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.025671 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0414db4-b3f4-4864-87a1-b84ab999cf9b","Type":"ContainerDied","Data":"5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c"}
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.025700 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0414db4-b3f4-4864-87a1-b84ab999cf9b","Type":"ContainerDied","Data":"f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d"}
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.025712 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0414db4-b3f4-4864-87a1-b84ab999cf9b","Type":"ContainerDied","Data":"821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b"}
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.027804 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerName="nova-api-log" containerID="cri-o://0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9" gracePeriod=30
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.029047 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr" event={"ID":"2d06311c-e246-4d3d-ba9c-388cb800ac4f","Type":"ContainerStarted","Data":"d8899a5ea677f40567793637ebd89b430187e401ecc0a4df4ac6944a237de212"}
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.029087 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.029447 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerName="nova-api-api" containerID="cri-o://4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2" gracePeriod=30
Oct 06 08:41:34 crc kubenswrapper[4991]: I1006 08:41:34.066212 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr" podStartSLOduration=3.066191957 podStartE2EDuration="3.066191957s" podCreationTimestamp="2025-10-06 08:41:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:34.059245623 +0000 UTC m=+1345.796995644" watchObservedRunningTime="2025-10-06 08:41:34.066191957 +0000 UTC m=+1345.803941978"
Oct 06 08:41:35 crc kubenswrapper[4991]: I1006 08:41:35.037439 4991 generic.go:334] "Generic (PLEG): container finished" podID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerID="0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9" exitCode=143
Oct 06 08:41:35 crc kubenswrapper[4991]: I1006 08:41:35.037530 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d242e246-427c-43d9-992e-9175bb2ac3d9","Type":"ContainerDied","Data":"0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9"}
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.922226 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.996936 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-run-httpd\") pod \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") "
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.997035 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgxg7\" (UniqueName: \"kubernetes.io/projected/c0414db4-b3f4-4864-87a1-b84ab999cf9b-kube-api-access-rgxg7\") pod \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") "
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.997074 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-combined-ca-bundle\") pod \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") "
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.997106 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-scripts\") pod \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") "
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.997137 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-ceilometer-tls-certs\") pod \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") "
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.997269 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-config-data\") pod \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") "
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.997354 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-log-httpd\") pod \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") "
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.997380 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-sg-core-conf-yaml\") pod \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\" (UID: \"c0414db4-b3f4-4864-87a1-b84ab999cf9b\") "
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.997592 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c0414db4-b3f4-4864-87a1-b84ab999cf9b" (UID: "c0414db4-b3f4-4864-87a1-b84ab999cf9b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.997954 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:36 crc kubenswrapper[4991]: I1006 08:41:36.999094 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c0414db4-b3f4-4864-87a1-b84ab999cf9b" (UID: "c0414db4-b3f4-4864-87a1-b84ab999cf9b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.014095 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0414db4-b3f4-4864-87a1-b84ab999cf9b-kube-api-access-rgxg7" (OuterVolumeSpecName: "kube-api-access-rgxg7") pod "c0414db4-b3f4-4864-87a1-b84ab999cf9b" (UID: "c0414db4-b3f4-4864-87a1-b84ab999cf9b"). InnerVolumeSpecName "kube-api-access-rgxg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.018893 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-scripts" (OuterVolumeSpecName: "scripts") pod "c0414db4-b3f4-4864-87a1-b84ab999cf9b" (UID: "c0414db4-b3f4-4864-87a1-b84ab999cf9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.036263 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c0414db4-b3f4-4864-87a1-b84ab999cf9b" (UID: "c0414db4-b3f4-4864-87a1-b84ab999cf9b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:41:37 crc kubenswrapper[4991]: E1006 08:41:37.042651 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc71fc75d_0a11_4673_a14d_90f3269ff26f.slice/crio-cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597\": RecentStats: unable to find data in memory cache]"
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.062251 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c0414db4-b3f4-4864-87a1-b84ab999cf9b" (UID: "c0414db4-b3f4-4864-87a1-b84ab999cf9b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.071798 4991 generic.go:334] "Generic (PLEG): container finished" podID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerID="c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888" exitCode=0
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.071839 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0414db4-b3f4-4864-87a1-b84ab999cf9b","Type":"ContainerDied","Data":"c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888"}
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.071865 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c0414db4-b3f4-4864-87a1-b84ab999cf9b","Type":"ContainerDied","Data":"90f1d13bcc0e0c9c30c21806566067aad6bd00112e618f9aa74fc2f3c3bfe650"}
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.071883 4991 scope.go:117] "RemoveContainer" containerID="5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c"
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.072016 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.091238 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0414db4-b3f4-4864-87a1-b84ab999cf9b" (UID: "c0414db4-b3f4-4864-87a1-b84ab999cf9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.099672 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c0414db4-b3f4-4864-87a1-b84ab999cf9b-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.099726 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.099736 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgxg7\" (UniqueName: \"kubernetes.io/projected/c0414db4-b3f4-4864-87a1-b84ab999cf9b-kube-api-access-rgxg7\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.099744 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.099752 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.099761 4991
reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.106558 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-config-data" (OuterVolumeSpecName: "config-data") pod "c0414db4-b3f4-4864-87a1-b84ab999cf9b" (UID: "c0414db4-b3f4-4864-87a1-b84ab999cf9b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.116992 4991 scope.go:117] "RemoveContainer" containerID="f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.134772 4991 scope.go:117] "RemoveContainer" containerID="c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.153846 4991 scope.go:117] "RemoveContainer" containerID="821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.173796 4991 scope.go:117] "RemoveContainer" containerID="5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c" Oct 06 08:41:37 crc kubenswrapper[4991]: E1006 08:41:37.174182 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c\": container with ID starting with 5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c not found: ID does not exist" containerID="5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.174209 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c"} err="failed to get container status \"5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c\": rpc error: code = NotFound desc = could not find container \"5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c\": container with ID starting with 5d378f93fb29da8559c460f81f7b41c90e032410e371bd8f8e1172a320c45f8c not found: ID does not exist" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.174228 4991 scope.go:117] "RemoveContainer" containerID="f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d" Oct 06 08:41:37 crc kubenswrapper[4991]: E1006 08:41:37.174517 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d\": container with ID starting with f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d not found: ID does not exist" containerID="f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.174561 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d"} err="failed to get container status \"f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d\": rpc error: code = NotFound desc = could not find container \"f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d\": container with ID starting with f40943887d8c6d2cc1779c185801d8cadb2597a5c0beab7525f109eacddce14d not found: ID does not exist" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.174591 4991 scope.go:117] "RemoveContainer" containerID="c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888" Oct 06 08:41:37 crc kubenswrapper[4991]: E1006 08:41:37.174859 4991 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888\": container with ID starting with c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888 not found: ID does not exist" containerID="c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.174882 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888"} err="failed to get container status \"c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888\": rpc error: code = NotFound desc = could not find container \"c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888\": container with ID starting with c7f064ab0cd4749106dab8cdfee8156ea52fb7040608773821ee526621461888 not found: ID does not exist" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.174894 4991 scope.go:117] "RemoveContainer" containerID="821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b" Oct 06 08:41:37 crc kubenswrapper[4991]: E1006 08:41:37.175102 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b\": container with ID starting with 821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b not found: ID does not exist" containerID="821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.175126 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b"} err="failed to get container status \"821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b\": rpc error: code = NotFound desc = could not find container 
\"821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b\": container with ID starting with 821aa051cc0206166cea12724c934809ca93ba0aa52532ecea1e355491f0168b not found: ID does not exist" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.201779 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0414db4-b3f4-4864-87a1-b84ab999cf9b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.411087 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.421795 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.447490 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:37 crc kubenswrapper[4991]: E1006 08:41:37.447857 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="sg-core" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.447876 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="sg-core" Oct 06 08:41:37 crc kubenswrapper[4991]: E1006 08:41:37.447895 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="proxy-httpd" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.447905 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="proxy-httpd" Oct 06 08:41:37 crc kubenswrapper[4991]: E1006 08:41:37.447941 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="ceilometer-central-agent" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.447950 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="ceilometer-central-agent" Oct 06 08:41:37 crc kubenswrapper[4991]: E1006 08:41:37.447967 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="ceilometer-notification-agent" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.447975 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="ceilometer-notification-agent" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.448206 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="ceilometer-notification-agent" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.448226 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="sg-core" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.448238 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="proxy-httpd" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.448252 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" containerName="ceilometer-central-agent" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.449954 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.454078 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.454157 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.455166 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.467076 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.517614 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.609157 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-combined-ca-bundle\") pod \"d242e246-427c-43d9-992e-9175bb2ac3d9\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.609234 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq489\" (UniqueName: \"kubernetes.io/projected/d242e246-427c-43d9-992e-9175bb2ac3d9-kube-api-access-wq489\") pod \"d242e246-427c-43d9-992e-9175bb2ac3d9\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.610635 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-config-data\") pod \"d242e246-427c-43d9-992e-9175bb2ac3d9\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 
08:41:37.610893 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d242e246-427c-43d9-992e-9175bb2ac3d9-logs\") pod \"d242e246-427c-43d9-992e-9175bb2ac3d9\" (UID: \"d242e246-427c-43d9-992e-9175bb2ac3d9\") " Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.611581 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-log-httpd\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.611648 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qv64\" (UniqueName: \"kubernetes.io/projected/9160ed8e-9be5-4d38-b9a0-7138dfecc506-kube-api-access-5qv64\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.611807 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.611835 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-config-data\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.612008 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-run-httpd\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.612064 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-scripts\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.612161 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.612227 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.612873 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d242e246-427c-43d9-992e-9175bb2ac3d9-logs" (OuterVolumeSpecName: "logs") pod "d242e246-427c-43d9-992e-9175bb2ac3d9" (UID: "d242e246-427c-43d9-992e-9175bb2ac3d9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.622365 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d242e246-427c-43d9-992e-9175bb2ac3d9-kube-api-access-wq489" (OuterVolumeSpecName: "kube-api-access-wq489") pod "d242e246-427c-43d9-992e-9175bb2ac3d9" (UID: "d242e246-427c-43d9-992e-9175bb2ac3d9"). InnerVolumeSpecName "kube-api-access-wq489". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.634658 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-config-data" (OuterVolumeSpecName: "config-data") pod "d242e246-427c-43d9-992e-9175bb2ac3d9" (UID: "d242e246-427c-43d9-992e-9175bb2ac3d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.638494 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d242e246-427c-43d9-992e-9175bb2ac3d9" (UID: "d242e246-427c-43d9-992e-9175bb2ac3d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714281 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-run-httpd\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714345 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-scripts\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714381 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714409 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714467 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-log-httpd\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714488 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qv64\" (UniqueName: 
\"kubernetes.io/projected/9160ed8e-9be5-4d38-b9a0-7138dfecc506-kube-api-access-5qv64\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714648 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714717 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-config-data\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714865 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714886 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d242e246-427c-43d9-992e-9175bb2ac3d9-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714896 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d242e246-427c-43d9-992e-9175bb2ac3d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.714909 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq489\" (UniqueName: \"kubernetes.io/projected/d242e246-427c-43d9-992e-9175bb2ac3d9-kube-api-access-wq489\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:37 crc kubenswrapper[4991]: 
I1006 08:41:37.714963 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-log-httpd\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.715391 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-run-httpd\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.718337 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.718457 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.718467 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-scripts\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.719098 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.720941 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-config-data\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.734618 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qv64\" (UniqueName: \"kubernetes.io/projected/9160ed8e-9be5-4d38-b9a0-7138dfecc506-kube-api-access-5qv64\") pod \"ceilometer-0\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " pod="openstack/ceilometer-0" Oct 06 08:41:37 crc kubenswrapper[4991]: I1006 08:41:37.771088 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.091814 4991 generic.go:334] "Generic (PLEG): container finished" podID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerID="4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2" exitCode=0 Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.091894 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d242e246-427c-43d9-992e-9175bb2ac3d9","Type":"ContainerDied","Data":"4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2"} Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.092225 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d242e246-427c-43d9-992e-9175bb2ac3d9","Type":"ContainerDied","Data":"4c00256b5a84f9818798f21399fba391c32511c711669cf6b391121e3d593d18"} Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.091915 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.092249 4991 scope.go:117] "RemoveContainer" containerID="4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.128546 4991 scope.go:117] "RemoveContainer" containerID="0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.132020 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.143534 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.158266 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:38 crc kubenswrapper[4991]: E1006 08:41:38.159002 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerName="nova-api-api" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.159047 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerName="nova-api-api" Oct 06 08:41:38 crc kubenswrapper[4991]: E1006 08:41:38.159060 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerName="nova-api-log" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.159068 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerName="nova-api-log" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.159354 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" containerName="nova-api-api" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.159403 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" 
containerName="nova-api-log" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.160905 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.163290 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.163504 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.170243 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.171714 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.198007 4991 scope.go:117] "RemoveContainer" containerID="4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2" Oct 06 08:41:38 crc kubenswrapper[4991]: E1006 08:41:38.198511 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2\": container with ID starting with 4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2 not found: ID does not exist" containerID="4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.198578 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2"} err="failed to get container status \"4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2\": rpc error: code = NotFound desc = could not find container \"4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2\": container with ID starting with 
4347c94e53c960ab65a4cf02806f861428e5d33e4efb9724d0348f52586da8f2 not found: ID does not exist" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.198602 4991 scope.go:117] "RemoveContainer" containerID="0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9" Oct 06 08:41:38 crc kubenswrapper[4991]: E1006 08:41:38.205131 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9\": container with ID starting with 0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9 not found: ID does not exist" containerID="0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.205184 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9"} err="failed to get container status \"0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9\": rpc error: code = NotFound desc = could not find container \"0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9\": container with ID starting with 0a7b8f10841910f50e7fdf30942ce67eead22a91c00f3eda4d4cc59ddbab87a9 not found: ID does not exist" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.241641 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.327148 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.330778 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.330837 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4hgg\" (UniqueName: \"kubernetes.io/projected/a0af55a4-2578-4675-9357-227310f62846-kube-api-access-x4hgg\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.330863 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.330950 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0af55a4-2578-4675-9357-227310f62846-logs\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.330974 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.330988 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-config-data\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.347875 4991 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.432747 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.432814 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4hgg\" (UniqueName: \"kubernetes.io/projected/a0af55a4-2578-4675-9357-227310f62846-kube-api-access-x4hgg\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.432843 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.432911 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0af55a4-2578-4675-9357-227310f62846-logs\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.432932 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.433583 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-config-data\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.433880 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0af55a4-2578-4675-9357-227310f62846-logs\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.438734 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.438805 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.438837 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-config-data\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.439268 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: 
I1006 08:41:38.453364 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4hgg\" (UniqueName: \"kubernetes.io/projected/a0af55a4-2578-4675-9357-227310f62846-kube-api-access-x4hgg\") pod \"nova-api-0\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.501785 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:38 crc kubenswrapper[4991]: I1006 08:41:38.947100 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:38 crc kubenswrapper[4991]: W1006 08:41:38.952673 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0af55a4_2578_4675_9357_227310f62846.slice/crio-f8b2f35540ce7860f14a207d0832d37255b59e5db3ec2b712b691f78ede8db58 WatchSource:0}: Error finding container f8b2f35540ce7860f14a207d0832d37255b59e5db3ec2b712b691f78ede8db58: Status 404 returned error can't find the container with id f8b2f35540ce7860f14a207d0832d37255b59e5db3ec2b712b691f78ede8db58 Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.110098 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0af55a4-2578-4675-9357-227310f62846","Type":"ContainerStarted","Data":"f8b2f35540ce7860f14a207d0832d37255b59e5db3ec2b712b691f78ede8db58"} Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.114716 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9160ed8e-9be5-4d38-b9a0-7138dfecc506","Type":"ContainerStarted","Data":"5649b7a21637b90f14805bd6598872f1dc5add87d3dbf7ebd86b1d9b50e3aaf8"} Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.114773 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9160ed8e-9be5-4d38-b9a0-7138dfecc506","Type":"ContainerStarted","Data":"9fb4f55ebe69e581e9ff3c3ba36973801afec20da07defb9f5e458675c816932"} Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.137394 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.279342 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0414db4-b3f4-4864-87a1-b84ab999cf9b" path="/var/lib/kubelet/pods/c0414db4-b3f4-4864-87a1-b84ab999cf9b/volumes" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.280476 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d242e246-427c-43d9-992e-9175bb2ac3d9" path="/var/lib/kubelet/pods/d242e246-427c-43d9-992e-9175bb2ac3d9/volumes" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.281082 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5prnr"] Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.282673 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.284585 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.286837 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.288144 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5prnr"] Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.460488 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.460592 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cqwn\" (UniqueName: \"kubernetes.io/projected/4926b604-c132-46b9-a156-46ae1662bc9d-kube-api-access-6cqwn\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.460659 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-scripts\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.460697 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-config-data\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.562617 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-scripts\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.562681 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-config-data\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.562727 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.562807 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cqwn\" (UniqueName: \"kubernetes.io/projected/4926b604-c132-46b9-a156-46ae1662bc9d-kube-api-access-6cqwn\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.566240 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.568237 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-config-data\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.568367 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-scripts\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.611847 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cqwn\" (UniqueName: \"kubernetes.io/projected/4926b604-c132-46b9-a156-46ae1662bc9d-kube-api-access-6cqwn\") pod \"nova-cell1-cell-mapping-5prnr\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:39 crc kubenswrapper[4991]: I1006 08:41:39.615542 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:40 crc kubenswrapper[4991]: I1006 08:41:40.060406 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5prnr"] Oct 06 08:41:40 crc kubenswrapper[4991]: I1006 08:41:40.133802 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0af55a4-2578-4675-9357-227310f62846","Type":"ContainerStarted","Data":"e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11"} Oct 06 08:41:40 crc kubenswrapper[4991]: I1006 08:41:40.134203 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0af55a4-2578-4675-9357-227310f62846","Type":"ContainerStarted","Data":"1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7"} Oct 06 08:41:40 crc kubenswrapper[4991]: I1006 08:41:40.135666 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5prnr" event={"ID":"4926b604-c132-46b9-a156-46ae1662bc9d","Type":"ContainerStarted","Data":"8b909a2553ee3a3197563a91e90c7c01cae7153b92bdf8858f029a59544b7f50"} Oct 06 08:41:40 crc kubenswrapper[4991]: I1006 08:41:40.138066 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9160ed8e-9be5-4d38-b9a0-7138dfecc506","Type":"ContainerStarted","Data":"73a68671b100b61405ad43b9f475805bafe221206d22cbb30d84e206645d9fff"} Oct 06 08:41:40 crc kubenswrapper[4991]: I1006 08:41:40.161229 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.161200531 podStartE2EDuration="2.161200531s" podCreationTimestamp="2025-10-06 08:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:40.158132255 +0000 UTC m=+1351.895882316" watchObservedRunningTime="2025-10-06 08:41:40.161200531 +0000 UTC m=+1351.898950582" 
Oct 06 08:41:41 crc kubenswrapper[4991]: I1006 08:41:41.151979 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5prnr" event={"ID":"4926b604-c132-46b9-a156-46ae1662bc9d","Type":"ContainerStarted","Data":"d7106d512c69044297389f1917afcf12abd789ac37bc7313f94e87abdc2dd932"} Oct 06 08:41:41 crc kubenswrapper[4991]: I1006 08:41:41.161707 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9160ed8e-9be5-4d38-b9a0-7138dfecc506","Type":"ContainerStarted","Data":"0d50a1a51fb76b15129da6a601cd7086c571cfcb88c3e73e801b6de71d603ab5"} Oct 06 08:41:41 crc kubenswrapper[4991]: I1006 08:41:41.184610 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5prnr" podStartSLOduration=2.184589654 podStartE2EDuration="2.184589654s" podCreationTimestamp="2025-10-06 08:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:41.18085408 +0000 UTC m=+1352.918604101" watchObservedRunningTime="2025-10-06 08:41:41.184589654 +0000 UTC m=+1352.922339685" Oct 06 08:41:41 crc kubenswrapper[4991]: I1006 08:41:41.514034 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr" Oct 06 08:41:41 crc kubenswrapper[4991]: I1006 08:41:41.683618 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rtlp9"] Oct 06 08:41:41 crc kubenswrapper[4991]: I1006 08:41:41.684155 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" podUID="a4f23dd7-0459-4c71-86af-7b589d466e9d" containerName="dnsmasq-dns" containerID="cri-o://90cd40de1bdce0b9010647126aca623edc030c70b8cf3e25c080af7f4f0d06b5" gracePeriod=10 Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.173173 4991 generic.go:334] "Generic (PLEG): 
container finished" podID="a4f23dd7-0459-4c71-86af-7b589d466e9d" containerID="90cd40de1bdce0b9010647126aca623edc030c70b8cf3e25c080af7f4f0d06b5" exitCode=0 Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.173241 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" event={"ID":"a4f23dd7-0459-4c71-86af-7b589d466e9d","Type":"ContainerDied","Data":"90cd40de1bdce0b9010647126aca623edc030c70b8cf3e25c080af7f4f0d06b5"} Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.173495 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" event={"ID":"a4f23dd7-0459-4c71-86af-7b589d466e9d","Type":"ContainerDied","Data":"c40b4d594cbc2a44239e5ded1aca2ab811b0840e59efb067f1c2cab61e24e0ca"} Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.173510 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c40b4d594cbc2a44239e5ded1aca2ab811b0840e59efb067f1c2cab61e24e0ca" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.181127 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9160ed8e-9be5-4d38-b9a0-7138dfecc506","Type":"ContainerStarted","Data":"f562f33940beff3b6675303b795947ca3b1c6407fc9bbc1512ece13bd67badb0"} Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.181181 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.259313 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.277567 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.0861828510000002 podStartE2EDuration="5.277552575s" podCreationTimestamp="2025-10-06 08:41:37 +0000 UTC" firstStartedPulling="2025-10-06 08:41:38.243479385 +0000 UTC m=+1349.981229406" lastFinishedPulling="2025-10-06 08:41:41.434849089 +0000 UTC m=+1353.172599130" observedRunningTime="2025-10-06 08:41:42.2083959 +0000 UTC m=+1353.946145951" watchObservedRunningTime="2025-10-06 08:41:42.277552575 +0000 UTC m=+1354.015302596" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.328380 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-nb\") pod \"a4f23dd7-0459-4c71-86af-7b589d466e9d\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.328526 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-sb\") pod \"a4f23dd7-0459-4c71-86af-7b589d466e9d\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.328568 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-swift-storage-0\") pod \"a4f23dd7-0459-4c71-86af-7b589d466e9d\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.328610 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-svc\") pod \"a4f23dd7-0459-4c71-86af-7b589d466e9d\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.328748 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2z4f\" (UniqueName: \"kubernetes.io/projected/a4f23dd7-0459-4c71-86af-7b589d466e9d-kube-api-access-j2z4f\") pod \"a4f23dd7-0459-4c71-86af-7b589d466e9d\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.328818 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-config\") pod \"a4f23dd7-0459-4c71-86af-7b589d466e9d\" (UID: \"a4f23dd7-0459-4c71-86af-7b589d466e9d\") " Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.334658 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f23dd7-0459-4c71-86af-7b589d466e9d-kube-api-access-j2z4f" (OuterVolumeSpecName: "kube-api-access-j2z4f") pod "a4f23dd7-0459-4c71-86af-7b589d466e9d" (UID: "a4f23dd7-0459-4c71-86af-7b589d466e9d"). InnerVolumeSpecName "kube-api-access-j2z4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.378194 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-config" (OuterVolumeSpecName: "config") pod "a4f23dd7-0459-4c71-86af-7b589d466e9d" (UID: "a4f23dd7-0459-4c71-86af-7b589d466e9d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.383548 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4f23dd7-0459-4c71-86af-7b589d466e9d" (UID: "a4f23dd7-0459-4c71-86af-7b589d466e9d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.388768 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4f23dd7-0459-4c71-86af-7b589d466e9d" (UID: "a4f23dd7-0459-4c71-86af-7b589d466e9d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.400817 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4f23dd7-0459-4c71-86af-7b589d466e9d" (UID: "a4f23dd7-0459-4c71-86af-7b589d466e9d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.407132 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4f23dd7-0459-4c71-86af-7b589d466e9d" (UID: "a4f23dd7-0459-4c71-86af-7b589d466e9d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.431228 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2z4f\" (UniqueName: \"kubernetes.io/projected/a4f23dd7-0459-4c71-86af-7b589d466e9d-kube-api-access-j2z4f\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.431265 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.431277 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.431285 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.431310 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:42 crc kubenswrapper[4991]: I1006 08:41:42.431321 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4f23dd7-0459-4c71-86af-7b589d466e9d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:43 crc kubenswrapper[4991]: I1006 08:41:43.188237 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" Oct 06 08:41:43 crc kubenswrapper[4991]: I1006 08:41:43.232466 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rtlp9"] Oct 06 08:41:43 crc kubenswrapper[4991]: I1006 08:41:43.241776 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-rtlp9"] Oct 06 08:41:43 crc kubenswrapper[4991]: I1006 08:41:43.256147 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f23dd7-0459-4c71-86af-7b589d466e9d" path="/var/lib/kubelet/pods/a4f23dd7-0459-4c71-86af-7b589d466e9d/volumes" Oct 06 08:41:45 crc kubenswrapper[4991]: I1006 08:41:45.223691 4991 generic.go:334] "Generic (PLEG): container finished" podID="4926b604-c132-46b9-a156-46ae1662bc9d" containerID="d7106d512c69044297389f1917afcf12abd789ac37bc7313f94e87abdc2dd932" exitCode=0 Oct 06 08:41:45 crc kubenswrapper[4991]: I1006 08:41:45.223852 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5prnr" event={"ID":"4926b604-c132-46b9-a156-46ae1662bc9d","Type":"ContainerDied","Data":"d7106d512c69044297389f1917afcf12abd789ac37bc7313f94e87abdc2dd932"} Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.650970 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.718337 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-config-data\") pod \"4926b604-c132-46b9-a156-46ae1662bc9d\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.718947 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-combined-ca-bundle\") pod \"4926b604-c132-46b9-a156-46ae1662bc9d\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.719073 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-scripts\") pod \"4926b604-c132-46b9-a156-46ae1662bc9d\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.719198 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cqwn\" (UniqueName: \"kubernetes.io/projected/4926b604-c132-46b9-a156-46ae1662bc9d-kube-api-access-6cqwn\") pod \"4926b604-c132-46b9-a156-46ae1662bc9d\" (UID: \"4926b604-c132-46b9-a156-46ae1662bc9d\") " Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.723984 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-scripts" (OuterVolumeSpecName: "scripts") pod "4926b604-c132-46b9-a156-46ae1662bc9d" (UID: "4926b604-c132-46b9-a156-46ae1662bc9d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.724652 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4926b604-c132-46b9-a156-46ae1662bc9d-kube-api-access-6cqwn" (OuterVolumeSpecName: "kube-api-access-6cqwn") pod "4926b604-c132-46b9-a156-46ae1662bc9d" (UID: "4926b604-c132-46b9-a156-46ae1662bc9d"). InnerVolumeSpecName "kube-api-access-6cqwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.746030 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-config-data" (OuterVolumeSpecName: "config-data") pod "4926b604-c132-46b9-a156-46ae1662bc9d" (UID: "4926b604-c132-46b9-a156-46ae1662bc9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.756109 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4926b604-c132-46b9-a156-46ae1662bc9d" (UID: "4926b604-c132-46b9-a156-46ae1662bc9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.821313 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.821342 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.821352 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cqwn\" (UniqueName: \"kubernetes.io/projected/4926b604-c132-46b9-a156-46ae1662bc9d-kube-api-access-6cqwn\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.821362 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4926b604-c132-46b9-a156-46ae1662bc9d-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:46 crc kubenswrapper[4991]: I1006 08:41:46.982774 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-865f5d856f-rtlp9" podUID="a4f23dd7-0459-4c71-86af-7b589d466e9d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.186:5353: i/o timeout" Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.247616 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5prnr" Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.260326 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5prnr" event={"ID":"4926b604-c132-46b9-a156-46ae1662bc9d","Type":"ContainerDied","Data":"8b909a2553ee3a3197563a91e90c7c01cae7153b92bdf8858f029a59544b7f50"} Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.260561 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b909a2553ee3a3197563a91e90c7c01cae7153b92bdf8858f029a59544b7f50" Oct 06 08:41:47 crc kubenswrapper[4991]: E1006 08:41:47.363280 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc71fc75d_0a11_4673_a14d_90f3269ff26f.slice/crio-cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4926b604_c132_46b9_a156_46ae1662bc9d.slice\": RecentStats: unable to find data in memory cache]" Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.426481 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.426930 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f59c5f21-de13-43af-94e4-a2fc82169e33" containerName="nova-scheduler-scheduler" containerID="cri-o://5f56329edbf132723d4798cb7d65ed87d6644c63faa775d835afbc6a4ee41572" gracePeriod=30 Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.445412 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.445654 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="a0af55a4-2578-4675-9357-227310f62846" containerName="nova-api-log" containerID="cri-o://1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7" gracePeriod=30 Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.445815 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a0af55a4-2578-4675-9357-227310f62846" containerName="nova-api-api" containerID="cri-o://e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11" gracePeriod=30 Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.461282 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.462470 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-metadata" containerID="cri-o://8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827" gracePeriod=30 Oct 06 08:41:47 crc kubenswrapper[4991]: I1006 08:41:47.462695 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-log" containerID="cri-o://180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23" gracePeriod=30 Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.015898 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.066841 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0af55a4-2578-4675-9357-227310f62846-logs\") pod \"a0af55a4-2578-4675-9357-227310f62846\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.066908 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-config-data\") pod \"a0af55a4-2578-4675-9357-227310f62846\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.066966 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4hgg\" (UniqueName: \"kubernetes.io/projected/a0af55a4-2578-4675-9357-227310f62846-kube-api-access-x4hgg\") pod \"a0af55a4-2578-4675-9357-227310f62846\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.067106 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-combined-ca-bundle\") pod \"a0af55a4-2578-4675-9357-227310f62846\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.067149 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-public-tls-certs\") pod \"a0af55a4-2578-4675-9357-227310f62846\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.067168 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-internal-tls-certs\") pod \"a0af55a4-2578-4675-9357-227310f62846\" (UID: \"a0af55a4-2578-4675-9357-227310f62846\") " Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.074315 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0af55a4-2578-4675-9357-227310f62846-logs" (OuterVolumeSpecName: "logs") pod "a0af55a4-2578-4675-9357-227310f62846" (UID: "a0af55a4-2578-4675-9357-227310f62846"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.074508 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0af55a4-2578-4675-9357-227310f62846-kube-api-access-x4hgg" (OuterVolumeSpecName: "kube-api-access-x4hgg") pod "a0af55a4-2578-4675-9357-227310f62846" (UID: "a0af55a4-2578-4675-9357-227310f62846"). InnerVolumeSpecName "kube-api-access-x4hgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.119446 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0af55a4-2578-4675-9357-227310f62846" (UID: "a0af55a4-2578-4675-9357-227310f62846"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.143277 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-config-data" (OuterVolumeSpecName: "config-data") pod "a0af55a4-2578-4675-9357-227310f62846" (UID: "a0af55a4-2578-4675-9357-227310f62846"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.144980 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0af55a4-2578-4675-9357-227310f62846" (UID: "a0af55a4-2578-4675-9357-227310f62846"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.146454 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0af55a4-2578-4675-9357-227310f62846" (UID: "a0af55a4-2578-4675-9357-227310f62846"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.169184 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4hgg\" (UniqueName: \"kubernetes.io/projected/a0af55a4-2578-4675-9357-227310f62846-kube-api-access-x4hgg\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.169217 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.169229 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.169238 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.169271 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0af55a4-2578-4675-9357-227310f62846-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.169282 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0af55a4-2578-4675-9357-227310f62846-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.257968 4991 generic.go:334] "Generic (PLEG): container finished" podID="a0af55a4-2578-4675-9357-227310f62846" containerID="e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11" exitCode=0 Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.258001 4991 generic.go:334] "Generic (PLEG): container finished" podID="a0af55a4-2578-4675-9357-227310f62846" containerID="1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7" exitCode=143 Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.258049 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.258041 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0af55a4-2578-4675-9357-227310f62846","Type":"ContainerDied","Data":"e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11"} Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.258197 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0af55a4-2578-4675-9357-227310f62846","Type":"ContainerDied","Data":"1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7"} Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.258213 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a0af55a4-2578-4675-9357-227310f62846","Type":"ContainerDied","Data":"f8b2f35540ce7860f14a207d0832d37255b59e5db3ec2b712b691f78ede8db58"} Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.258235 4991 scope.go:117] "RemoveContainer" containerID="e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.260378 4991 generic.go:334] "Generic (PLEG): container finished" podID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerID="180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23" exitCode=143 Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.260487 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"704c9be7-2e65-4018-9388-7f75e8f4dcd6","Type":"ContainerDied","Data":"180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23"} Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.285344 4991 scope.go:117] "RemoveContainer" containerID="1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.311639 4991 scope.go:117] "RemoveContainer" 
containerID="e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.313114 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:48 crc kubenswrapper[4991]: E1006 08:41:48.313456 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11\": container with ID starting with e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11 not found: ID does not exist" containerID="e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.313516 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11"} err="failed to get container status \"e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11\": rpc error: code = NotFound desc = could not find container \"e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11\": container with ID starting with e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11 not found: ID does not exist" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.313548 4991 scope.go:117] "RemoveContainer" containerID="1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7" Oct 06 08:41:48 crc kubenswrapper[4991]: E1006 08:41:48.314691 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7\": container with ID starting with 1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7 not found: ID does not exist" containerID="1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.314731 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7"} err="failed to get container status \"1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7\": rpc error: code = NotFound desc = could not find container \"1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7\": container with ID starting with 1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7 not found: ID does not exist" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.314764 4991 scope.go:117] "RemoveContainer" containerID="e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.316076 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11"} err="failed to get container status \"e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11\": rpc error: code = NotFound desc = could not find container \"e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11\": container with ID starting with e0f9f05677121d4f98ec0da30f71b982912ab9c00c5e37d63214dd9f4e275a11 not found: ID does not exist" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.316139 4991 scope.go:117] "RemoveContainer" containerID="1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.316475 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7"} err="failed to get container status \"1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7\": rpc error: code = NotFound desc = could not find container \"1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7\": container with ID starting with 
1a28c85e7873d6747e87a5f5867066c9ff9a28590ca8ba68558de7fe1096b4a7 not found: ID does not exist" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.332911 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.341476 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:48 crc kubenswrapper[4991]: E1006 08:41:48.341835 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0af55a4-2578-4675-9357-227310f62846" containerName="nova-api-log" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.341850 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0af55a4-2578-4675-9357-227310f62846" containerName="nova-api-log" Oct 06 08:41:48 crc kubenswrapper[4991]: E1006 08:41:48.341886 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f23dd7-0459-4c71-86af-7b589d466e9d" containerName="dnsmasq-dns" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.341894 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f23dd7-0459-4c71-86af-7b589d466e9d" containerName="dnsmasq-dns" Oct 06 08:41:48 crc kubenswrapper[4991]: E1006 08:41:48.341901 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4926b604-c132-46b9-a156-46ae1662bc9d" containerName="nova-manage" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.341907 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4926b604-c132-46b9-a156-46ae1662bc9d" containerName="nova-manage" Oct 06 08:41:48 crc kubenswrapper[4991]: E1006 08:41:48.341916 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f23dd7-0459-4c71-86af-7b589d466e9d" containerName="init" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.341922 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f23dd7-0459-4c71-86af-7b589d466e9d" containerName="init" Oct 06 08:41:48 crc kubenswrapper[4991]: E1006 08:41:48.341931 4991 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0af55a4-2578-4675-9357-227310f62846" containerName="nova-api-api" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.341937 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0af55a4-2578-4675-9357-227310f62846" containerName="nova-api-api" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.342110 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0af55a4-2578-4675-9357-227310f62846" containerName="nova-api-log" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.342130 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f23dd7-0459-4c71-86af-7b589d466e9d" containerName="dnsmasq-dns" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.342137 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4926b604-c132-46b9-a156-46ae1662bc9d" containerName="nova-manage" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.342156 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0af55a4-2578-4675-9357-227310f62846" containerName="nova-api-api" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.343122 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.345940 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.346115 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.346408 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.351059 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.473383 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.473461 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jdfx\" (UniqueName: \"kubernetes.io/projected/23e696d7-7767-4a92-9828-a189ffb52275-kube-api-access-7jdfx\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.473487 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-public-tls-certs\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.473603 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e696d7-7767-4a92-9828-a189ffb52275-logs\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.473642 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-config-data\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.473695 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-internal-tls-certs\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.575945 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jdfx\" (UniqueName: \"kubernetes.io/projected/23e696d7-7767-4a92-9828-a189ffb52275-kube-api-access-7jdfx\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.576019 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-public-tls-certs\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.576173 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e696d7-7767-4a92-9828-a189ffb52275-logs\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 
crc kubenswrapper[4991]: I1006 08:41:48.576237 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-config-data\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.576365 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-internal-tls-certs\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.576496 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.576955 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e696d7-7767-4a92-9828-a189ffb52275-logs\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.580575 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-config-data\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.580638 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.586816 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-public-tls-certs\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.587844 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.603182 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jdfx\" (UniqueName: \"kubernetes.io/projected/23e696d7-7767-4a92-9828-a189ffb52275-kube-api-access-7jdfx\") pod \"nova-api-0\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " pod="openstack/nova-api-0" Oct 06 08:41:48 crc kubenswrapper[4991]: I1006 08:41:48.678624 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:41:49 crc kubenswrapper[4991]: I1006 08:41:49.147554 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:41:49 crc kubenswrapper[4991]: I1006 08:41:49.260652 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0af55a4-2578-4675-9357-227310f62846" path="/var/lib/kubelet/pods/a0af55a4-2578-4675-9357-227310f62846/volumes" Oct 06 08:41:49 crc kubenswrapper[4991]: I1006 08:41:49.277882 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23e696d7-7767-4a92-9828-a189ffb52275","Type":"ContainerStarted","Data":"858b1c9988c6c99d9be9614393b9d3088930a19a6747938253299bfda4441743"} Oct 06 08:41:50 crc kubenswrapper[4991]: I1006 08:41:50.294891 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23e696d7-7767-4a92-9828-a189ffb52275","Type":"ContainerStarted","Data":"98d4fcdfdc9774dff4624bf92e206f1e36780461435c0e70b7a79655aa1bd813"} Oct 06 08:41:50 crc kubenswrapper[4991]: I1006 08:41:50.295153 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23e696d7-7767-4a92-9828-a189ffb52275","Type":"ContainerStarted","Data":"832edd5d33c524ced05fce73559b98b910c69bcaa4f037231d4db46add5712d9"} Oct 06 08:41:50 crc kubenswrapper[4991]: I1006 08:41:50.325730 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.325702393 podStartE2EDuration="2.325702393s" podCreationTimestamp="2025-10-06 08:41:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:50.310868388 +0000 UTC m=+1362.048618469" watchObservedRunningTime="2025-10-06 08:41:50.325702393 +0000 UTC m=+1362.063452444" Oct 06 08:41:50 crc kubenswrapper[4991]: I1006 08:41:50.618619 4991 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/nova-metadata-0" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": read tcp 10.217.0.2:54226->10.217.0.190:8775: read: connection reset by peer" Oct 06 08:41:50 crc kubenswrapper[4991]: I1006 08:41:50.618641 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": read tcp 10.217.0.2:54232->10.217.0.190:8775: read: connection reset by peer" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.111535 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.225891 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-config-data\") pod \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.226219 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-combined-ca-bundle\") pod \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.226242 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2vz5\" (UniqueName: \"kubernetes.io/projected/704c9be7-2e65-4018-9388-7f75e8f4dcd6-kube-api-access-t2vz5\") pod \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.226269 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-nova-metadata-tls-certs\") pod \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.226403 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/704c9be7-2e65-4018-9388-7f75e8f4dcd6-logs\") pod \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\" (UID: \"704c9be7-2e65-4018-9388-7f75e8f4dcd6\") " Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.227370 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704c9be7-2e65-4018-9388-7f75e8f4dcd6-logs" (OuterVolumeSpecName: "logs") pod "704c9be7-2e65-4018-9388-7f75e8f4dcd6" (UID: "704c9be7-2e65-4018-9388-7f75e8f4dcd6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.232326 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704c9be7-2e65-4018-9388-7f75e8f4dcd6-kube-api-access-t2vz5" (OuterVolumeSpecName: "kube-api-access-t2vz5") pod "704c9be7-2e65-4018-9388-7f75e8f4dcd6" (UID: "704c9be7-2e65-4018-9388-7f75e8f4dcd6"). InnerVolumeSpecName "kube-api-access-t2vz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.261447 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-config-data" (OuterVolumeSpecName: "config-data") pod "704c9be7-2e65-4018-9388-7f75e8f4dcd6" (UID: "704c9be7-2e65-4018-9388-7f75e8f4dcd6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.277967 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "704c9be7-2e65-4018-9388-7f75e8f4dcd6" (UID: "704c9be7-2e65-4018-9388-7f75e8f4dcd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.301886 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "704c9be7-2e65-4018-9388-7f75e8f4dcd6" (UID: "704c9be7-2e65-4018-9388-7f75e8f4dcd6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.318454 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.318918 4991 generic.go:334] "Generic (PLEG): container finished" podID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerID="8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827" exitCode=0 Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.318963 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.318992 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"704c9be7-2e65-4018-9388-7f75e8f4dcd6","Type":"ContainerDied","Data":"8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827"} Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.319022 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"704c9be7-2e65-4018-9388-7f75e8f4dcd6","Type":"ContainerDied","Data":"53782fc04dbb7a90a25c9515e5478fb1738c707fc10f55c2b7c8aef803e82a1c"} Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.319040 4991 scope.go:117] "RemoveContainer" containerID="8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.323892 4991 generic.go:334] "Generic (PLEG): container finished" podID="f59c5f21-de13-43af-94e4-a2fc82169e33" containerID="5f56329edbf132723d4798cb7d65ed87d6644c63faa775d835afbc6a4ee41572" exitCode=0 Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.328122 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.328270 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f59c5f21-de13-43af-94e4-a2fc82169e33","Type":"ContainerDied","Data":"5f56329edbf132723d4798cb7d65ed87d6644c63faa775d835afbc6a4ee41572"} Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.330072 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/704c9be7-2e65-4018-9388-7f75e8f4dcd6-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.330094 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.330103 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.330114 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2vz5\" (UniqueName: \"kubernetes.io/projected/704c9be7-2e65-4018-9388-7f75e8f4dcd6-kube-api-access-t2vz5\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.330125 4991 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/704c9be7-2e65-4018-9388-7f75e8f4dcd6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.395462 4991 scope.go:117] "RemoveContainer" containerID="180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.437097 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6r67b\" (UniqueName: \"kubernetes.io/projected/f59c5f21-de13-43af-94e4-a2fc82169e33-kube-api-access-6r67b\") pod \"f59c5f21-de13-43af-94e4-a2fc82169e33\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.437275 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-combined-ca-bundle\") pod \"f59c5f21-de13-43af-94e4-a2fc82169e33\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.437443 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-config-data\") pod \"f59c5f21-de13-43af-94e4-a2fc82169e33\" (UID: \"f59c5f21-de13-43af-94e4-a2fc82169e33\") " Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.438998 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.450480 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.454367 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59c5f21-de13-43af-94e4-a2fc82169e33-kube-api-access-6r67b" (OuterVolumeSpecName: "kube-api-access-6r67b") pod "f59c5f21-de13-43af-94e4-a2fc82169e33" (UID: "f59c5f21-de13-43af-94e4-a2fc82169e33"). InnerVolumeSpecName "kube-api-access-6r67b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.466431 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-config-data" (OuterVolumeSpecName: "config-data") pod "f59c5f21-de13-43af-94e4-a2fc82169e33" (UID: "f59c5f21-de13-43af-94e4-a2fc82169e33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.467330 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:51 crc kubenswrapper[4991]: E1006 08:41:51.467722 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-log" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.467735 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-log" Oct 06 08:41:51 crc kubenswrapper[4991]: E1006 08:41:51.467749 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f59c5f21-de13-43af-94e4-a2fc82169e33" containerName="nova-scheduler-scheduler" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.467755 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59c5f21-de13-43af-94e4-a2fc82169e33" containerName="nova-scheduler-scheduler" Oct 06 08:41:51 crc kubenswrapper[4991]: E1006 08:41:51.467778 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-metadata" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.467783 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-metadata" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.467946 4991 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-metadata" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.467962 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" containerName="nova-metadata-log" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.467984 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f59c5f21-de13-43af-94e4-a2fc82169e33" containerName="nova-scheduler-scheduler" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.468463 4991 scope.go:117] "RemoveContainer" containerID="8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.469130 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: E1006 08:41:51.470165 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827\": container with ID starting with 8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827 not found: ID does not exist" containerID="8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.470193 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827"} err="failed to get container status \"8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827\": rpc error: code = NotFound desc = could not find container \"8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827\": container with ID starting with 8e98471c6638eca5bda79b53d5fbdb72daf5c49bfab55850b2b7d45dce526827 not found: ID does not exist" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.470214 4991 scope.go:117] 
"RemoveContainer" containerID="180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23" Oct 06 08:41:51 crc kubenswrapper[4991]: E1006 08:41:51.470829 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23\": container with ID starting with 180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23 not found: ID does not exist" containerID="180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.473545 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23"} err="failed to get container status \"180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23\": rpc error: code = NotFound desc = could not find container \"180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23\": container with ID starting with 180bacb28f76219a101307fdb32e0a770f3965f49cc7737bb83c3f7eadda4c23 not found: ID does not exist" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.473575 4991 scope.go:117] "RemoveContainer" containerID="5f56329edbf132723d4798cb7d65ed87d6644c63faa775d835afbc6a4ee41572" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.471784 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.471854 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.476065 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.492489 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f59c5f21-de13-43af-94e4-a2fc82169e33" (UID: "f59c5f21-de13-43af-94e4-a2fc82169e33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.541790 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e2b1c5-03aa-4472-9002-7daf936edc67-logs\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.541861 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.541898 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-config-data\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.541983 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.542031 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hlwln\" (UniqueName: \"kubernetes.io/projected/70e2b1c5-03aa-4472-9002-7daf936edc67-kube-api-access-hlwln\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.542147 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.542168 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r67b\" (UniqueName: \"kubernetes.io/projected/f59c5f21-de13-43af-94e4-a2fc82169e33-kube-api-access-6r67b\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.542182 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59c5f21-de13-43af-94e4-a2fc82169e33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.644016 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.644068 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlwln\" (UniqueName: \"kubernetes.io/projected/70e2b1c5-03aa-4472-9002-7daf936edc67-kube-api-access-hlwln\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.644176 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/70e2b1c5-03aa-4472-9002-7daf936edc67-logs\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.644199 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.644219 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-config-data\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.645625 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e2b1c5-03aa-4472-9002-7daf936edc67-logs\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.649327 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.650339 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-config-data\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 
08:41:51.654569 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.673036 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.676087 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlwln\" (UniqueName: \"kubernetes.io/projected/70e2b1c5-03aa-4472-9002-7daf936edc67-kube-api-access-hlwln\") pod \"nova-metadata-0\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.681457 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.691044 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.692554 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.694602 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.706858 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.746185 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-config-data\") pod \"nova-scheduler-0\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.746245 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9448\" (UniqueName: \"kubernetes.io/projected/48f4202b-6558-4fe3-8fcc-732aa1a88e60-kube-api-access-d9448\") pod \"nova-scheduler-0\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.746450 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.786612 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.849493 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-config-data\") pod \"nova-scheduler-0\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.849863 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9448\" (UniqueName: \"kubernetes.io/projected/48f4202b-6558-4fe3-8fcc-732aa1a88e60-kube-api-access-d9448\") pod \"nova-scheduler-0\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.850073 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.854771 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-config-data\") pod \"nova-scheduler-0\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.856834 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:51 crc kubenswrapper[4991]: I1006 08:41:51.941671 4991 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-d9448\" (UniqueName: \"kubernetes.io/projected/48f4202b-6558-4fe3-8fcc-732aa1a88e60-kube-api-access-d9448\") pod \"nova-scheduler-0\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " pod="openstack/nova-scheduler-0" Oct 06 08:41:52 crc kubenswrapper[4991]: I1006 08:41:52.091453 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:41:52 crc kubenswrapper[4991]: I1006 08:41:52.289769 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:41:52 crc kubenswrapper[4991]: W1006 08:41:52.294558 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70e2b1c5_03aa_4472_9002_7daf936edc67.slice/crio-a48db5685f7377e9be4ea3f517179ff3b19dee2b7baad29e35463d0a8cdeb6c6 WatchSource:0}: Error finding container a48db5685f7377e9be4ea3f517179ff3b19dee2b7baad29e35463d0a8cdeb6c6: Status 404 returned error can't find the container with id a48db5685f7377e9be4ea3f517179ff3b19dee2b7baad29e35463d0a8cdeb6c6 Oct 06 08:41:52 crc kubenswrapper[4991]: I1006 08:41:52.347455 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70e2b1c5-03aa-4472-9002-7daf936edc67","Type":"ContainerStarted","Data":"a48db5685f7377e9be4ea3f517179ff3b19dee2b7baad29e35463d0a8cdeb6c6"} Oct 06 08:41:52 crc kubenswrapper[4991]: I1006 08:41:52.567038 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:41:52 crc kubenswrapper[4991]: W1006 08:41:52.592164 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48f4202b_6558_4fe3_8fcc_732aa1a88e60.slice/crio-70e67b9801c80b2624076c47d8258dd45b7659f413ca1ca9cb329dd5fe8c2fec WatchSource:0}: Error finding container 70e67b9801c80b2624076c47d8258dd45b7659f413ca1ca9cb329dd5fe8c2fec: Status 404 
returned error can't find the container with id 70e67b9801c80b2624076c47d8258dd45b7659f413ca1ca9cb329dd5fe8c2fec Oct 06 08:41:53 crc kubenswrapper[4991]: I1006 08:41:53.262960 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="704c9be7-2e65-4018-9388-7f75e8f4dcd6" path="/var/lib/kubelet/pods/704c9be7-2e65-4018-9388-7f75e8f4dcd6/volumes" Oct 06 08:41:53 crc kubenswrapper[4991]: I1006 08:41:53.264446 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59c5f21-de13-43af-94e4-a2fc82169e33" path="/var/lib/kubelet/pods/f59c5f21-de13-43af-94e4-a2fc82169e33/volumes" Oct 06 08:41:53 crc kubenswrapper[4991]: I1006 08:41:53.361240 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70e2b1c5-03aa-4472-9002-7daf936edc67","Type":"ContainerStarted","Data":"6a0dd291d385b5c827db71bd9cfc93863a8619475a3e50f8d7d1d405394842f9"} Oct 06 08:41:53 crc kubenswrapper[4991]: I1006 08:41:53.361332 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70e2b1c5-03aa-4472-9002-7daf936edc67","Type":"ContainerStarted","Data":"4c49ecde03a108088eaff49d978ace50e6654f0b6205db59fddf267b0df1faab"} Oct 06 08:41:53 crc kubenswrapper[4991]: I1006 08:41:53.364335 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"48f4202b-6558-4fe3-8fcc-732aa1a88e60","Type":"ContainerStarted","Data":"4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696"} Oct 06 08:41:53 crc kubenswrapper[4991]: I1006 08:41:53.364384 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"48f4202b-6558-4fe3-8fcc-732aa1a88e60","Type":"ContainerStarted","Data":"70e67b9801c80b2624076c47d8258dd45b7659f413ca1ca9cb329dd5fe8c2fec"} Oct 06 08:41:53 crc kubenswrapper[4991]: I1006 08:41:53.395895 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.395876551 podStartE2EDuration="2.395876551s" podCreationTimestamp="2025-10-06 08:41:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:53.382628637 +0000 UTC m=+1365.120378698" watchObservedRunningTime="2025-10-06 08:41:53.395876551 +0000 UTC m=+1365.133626572" Oct 06 08:41:53 crc kubenswrapper[4991]: I1006 08:41:53.406875 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.406856909 podStartE2EDuration="2.406856909s" podCreationTimestamp="2025-10-06 08:41:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:41:53.400632469 +0000 UTC m=+1365.138382500" watchObservedRunningTime="2025-10-06 08:41:53.406856909 +0000 UTC m=+1365.144606930" Oct 06 08:41:56 crc kubenswrapper[4991]: I1006 08:41:56.787147 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 08:41:56 crc kubenswrapper[4991]: I1006 08:41:56.787908 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 08:41:57 crc kubenswrapper[4991]: I1006 08:41:57.092902 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 08:41:57 crc kubenswrapper[4991]: E1006 08:41:57.735117 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc71fc75d_0a11_4673_a14d_90f3269ff26f.slice/crio-cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597\": RecentStats: unable to find data in memory cache]" Oct 06 08:41:58 crc kubenswrapper[4991]: I1006 08:41:58.679535 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Oct 06 08:41:58 crc kubenswrapper[4991]: I1006 08:41:58.679900 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 08:41:59 crc kubenswrapper[4991]: I1006 08:41:59.697596 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="23e696d7-7767-4a92-9828-a189ffb52275" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 08:41:59 crc kubenswrapper[4991]: I1006 08:41:59.697596 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="23e696d7-7767-4a92-9828-a189ffb52275" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 08:42:01 crc kubenswrapper[4991]: I1006 08:42:01.787466 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 08:42:01 crc kubenswrapper[4991]: I1006 08:42:01.787814 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 08:42:02 crc kubenswrapper[4991]: I1006 08:42:02.092792 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 08:42:02 crc kubenswrapper[4991]: I1006 08:42:02.135980 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 08:42:02 crc kubenswrapper[4991]: I1006 08:42:02.503966 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 08:42:02 crc kubenswrapper[4991]: I1006 08:42:02.801491 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 08:42:02 crc kubenswrapper[4991]: I1006 08:42:02.801578 4991 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 08:42:07 crc kubenswrapper[4991]: I1006 08:42:07.782647 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 08:42:08 crc kubenswrapper[4991]: E1006 08:42:08.088471 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc71fc75d_0a11_4673_a14d_90f3269ff26f.slice/crio-cfe21ba219347645960b422708293d79554a8bd271b712cf6fad6a029f3aa597\": RecentStats: unable to find data in memory cache]" Oct 06 08:42:08 crc kubenswrapper[4991]: I1006 08:42:08.689635 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 08:42:08 crc kubenswrapper[4991]: I1006 08:42:08.689999 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 08:42:08 crc kubenswrapper[4991]: I1006 08:42:08.691339 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 08:42:08 crc kubenswrapper[4991]: I1006 08:42:08.705169 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 08:42:09 crc kubenswrapper[4991]: I1006 08:42:09.559982 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 08:42:09 crc kubenswrapper[4991]: I1006 
08:42:09.572522 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 08:42:11 crc kubenswrapper[4991]: I1006 08:42:11.791844 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 08:42:11 crc kubenswrapper[4991]: I1006 08:42:11.798100 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 08:42:11 crc kubenswrapper[4991]: I1006 08:42:11.801650 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 08:42:12 crc kubenswrapper[4991]: I1006 08:42:12.598247 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.588575 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xggwb"] Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.593165 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.607600 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srdfc\" (UniqueName: \"kubernetes.io/projected/b81f214f-994e-4a29-9171-c9860bd89fb8-kube-api-access-srdfc\") pod \"certified-operators-xggwb\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.607812 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-utilities\") pod \"certified-operators-xggwb\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.607942 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-catalog-content\") pod \"certified-operators-xggwb\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.627828 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xggwb"] Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.709696 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-catalog-content\") pod \"certified-operators-xggwb\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.710156 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-srdfc\" (UniqueName: \"kubernetes.io/projected/b81f214f-994e-4a29-9171-c9860bd89fb8-kube-api-access-srdfc\") pod \"certified-operators-xggwb\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.710238 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-utilities\") pod \"certified-operators-xggwb\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.710791 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-catalog-content\") pod \"certified-operators-xggwb\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.710835 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-utilities\") pod \"certified-operators-xggwb\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.742049 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srdfc\" (UniqueName: \"kubernetes.io/projected/b81f214f-994e-4a29-9171-c9860bd89fb8-kube-api-access-srdfc\") pod \"certified-operators-xggwb\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:19 crc kubenswrapper[4991]: I1006 08:42:19.954876 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:20 crc kubenswrapper[4991]: I1006 08:42:20.420551 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xggwb"] Oct 06 08:42:20 crc kubenswrapper[4991]: I1006 08:42:20.689776 4991 generic.go:334] "Generic (PLEG): container finished" podID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerID="0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c" exitCode=0 Oct 06 08:42:20 crc kubenswrapper[4991]: I1006 08:42:20.689828 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xggwb" event={"ID":"b81f214f-994e-4a29-9171-c9860bd89fb8","Type":"ContainerDied","Data":"0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c"} Oct 06 08:42:20 crc kubenswrapper[4991]: I1006 08:42:20.690141 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xggwb" event={"ID":"b81f214f-994e-4a29-9171-c9860bd89fb8","Type":"ContainerStarted","Data":"2bb43381a4b23d80650eb1fe03b9be63ebbec3e626ccee240303c92e450f1454"} Oct 06 08:42:20 crc kubenswrapper[4991]: I1006 08:42:20.691846 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 08:42:21 crc kubenswrapper[4991]: I1006 08:42:21.705370 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xggwb" event={"ID":"b81f214f-994e-4a29-9171-c9860bd89fb8","Type":"ContainerStarted","Data":"14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5"} Oct 06 08:42:22 crc kubenswrapper[4991]: I1006 08:42:22.724175 4991 generic.go:334] "Generic (PLEG): container finished" podID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerID="14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5" exitCode=0 Oct 06 08:42:22 crc kubenswrapper[4991]: I1006 08:42:22.724244 4991 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-xggwb" event={"ID":"b81f214f-994e-4a29-9171-c9860bd89fb8","Type":"ContainerDied","Data":"14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5"} Oct 06 08:42:23 crc kubenswrapper[4991]: I1006 08:42:23.737216 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xggwb" event={"ID":"b81f214f-994e-4a29-9171-c9860bd89fb8","Type":"ContainerStarted","Data":"534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d"} Oct 06 08:42:23 crc kubenswrapper[4991]: I1006 08:42:23.762154 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xggwb" podStartSLOduration=2.301482292 podStartE2EDuration="4.762139439s" podCreationTimestamp="2025-10-06 08:42:19 +0000 UTC" firstStartedPulling="2025-10-06 08:42:20.69159536 +0000 UTC m=+1392.429345371" lastFinishedPulling="2025-10-06 08:42:23.152252457 +0000 UTC m=+1394.890002518" observedRunningTime="2025-10-06 08:42:23.755386713 +0000 UTC m=+1395.493136734" watchObservedRunningTime="2025-10-06 08:42:23.762139439 +0000 UTC m=+1395.499889460" Oct 06 08:42:29 crc kubenswrapper[4991]: I1006 08:42:29.955328 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:29 crc kubenswrapper[4991]: I1006 08:42:29.956103 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:30 crc kubenswrapper[4991]: I1006 08:42:30.038903 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:30 crc kubenswrapper[4991]: I1006 08:42:30.894438 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:30 crc kubenswrapper[4991]: I1006 08:42:30.963421 
4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xggwb"] Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.697764 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8bc2q"] Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.703051 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.724511 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bc2q"] Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.836143 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xggwb" podUID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerName="registry-server" containerID="cri-o://534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d" gracePeriod=2 Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.873588 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhc2r\" (UniqueName: \"kubernetes.io/projected/544d772e-ee45-4bd6-9895-07dec1dc3ff1-kube-api-access-vhc2r\") pod \"redhat-marketplace-8bc2q\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.873740 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-utilities\") pod \"redhat-marketplace-8bc2q\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.873803 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-catalog-content\") pod \"redhat-marketplace-8bc2q\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.975353 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-catalog-content\") pod \"redhat-marketplace-8bc2q\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.975516 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhc2r\" (UniqueName: \"kubernetes.io/projected/544d772e-ee45-4bd6-9895-07dec1dc3ff1-kube-api-access-vhc2r\") pod \"redhat-marketplace-8bc2q\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.975611 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-utilities\") pod \"redhat-marketplace-8bc2q\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.976081 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-catalog-content\") pod \"redhat-marketplace-8bc2q\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:32 crc kubenswrapper[4991]: I1006 08:42:32.976147 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-utilities\") pod \"redhat-marketplace-8bc2q\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.008326 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhc2r\" (UniqueName: \"kubernetes.io/projected/544d772e-ee45-4bd6-9895-07dec1dc3ff1-kube-api-access-vhc2r\") pod \"redhat-marketplace-8bc2q\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.037778 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.504409 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.641801 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance15db-account-delete-rpvcb"] Oct 06 08:42:33 crc kubenswrapper[4991]: E1006 08:42:33.642203 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerName="extract-content" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.642219 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerName="extract-content" Oct 06 08:42:33 crc kubenswrapper[4991]: E1006 08:42:33.642236 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerName="extract-utilities" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.642242 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerName="extract-utilities" Oct 06 08:42:33 crc 
kubenswrapper[4991]: E1006 08:42:33.642259 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerName="registry-server" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.642265 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerName="registry-server" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.656177 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerName="registry-server" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.656823 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.657008 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="e8e91b06-a3c1-41dc-b2f8-af738647ade8" containerName="openstackclient" containerID="cri-o://81c13c33c57ac2a7fafdd527f3ce9a1ddf23d76bafb0a388ddb5af43c282a8b4" gracePeriod=2 Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.657230 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance15db-account-delete-rpvcb" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.679642 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.691736 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-utilities\") pod \"b81f214f-994e-4a29-9171-c9860bd89fb8\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.691818 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srdfc\" (UniqueName: \"kubernetes.io/projected/b81f214f-994e-4a29-9171-c9860bd89fb8-kube-api-access-srdfc\") pod \"b81f214f-994e-4a29-9171-c9860bd89fb8\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.691946 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-catalog-content\") pod \"b81f214f-994e-4a29-9171-c9860bd89fb8\" (UID: \"b81f214f-994e-4a29-9171-c9860bd89fb8\") " Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.692508 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-utilities" (OuterVolumeSpecName: "utilities") pod "b81f214f-994e-4a29-9171-c9860bd89fb8" (UID: "b81f214f-994e-4a29-9171-c9860bd89fb8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.701759 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance15db-account-delete-rpvcb"] Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.704514 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81f214f-994e-4a29-9171-c9860bd89fb8-kube-api-access-srdfc" (OuterVolumeSpecName: "kube-api-access-srdfc") pod "b81f214f-994e-4a29-9171-c9860bd89fb8" (UID: "b81f214f-994e-4a29-9171-c9860bd89fb8"). InnerVolumeSpecName "kube-api-access-srdfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.725445 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementd82d-account-delete-jlr78"] Oct 06 08:42:33 crc kubenswrapper[4991]: E1006 08:42:33.725883 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e91b06-a3c1-41dc-b2f8-af738647ade8" containerName="openstackclient" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.725894 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e91b06-a3c1-41dc-b2f8-af738647ade8" containerName="openstackclient" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.726077 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8e91b06-a3c1-41dc-b2f8-af738647ade8" containerName="openstackclient" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.726683 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementd82d-account-delete-jlr78" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.746936 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementd82d-account-delete-jlr78"] Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.796401 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jd6h\" (UniqueName: \"kubernetes.io/projected/50622552-6b5c-4af5-a457-09c526c54f3f-kube-api-access-5jd6h\") pod \"glance15db-account-delete-rpvcb\" (UID: \"50622552-6b5c-4af5-a457-09c526c54f3f\") " pod="openstack/glance15db-account-delete-rpvcb" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.796528 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.796543 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srdfc\" (UniqueName: \"kubernetes.io/projected/b81f214f-994e-4a29-9171-c9860bd89fb8-kube-api-access-srdfc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.806453 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b81f214f-994e-4a29-9171-c9860bd89fb8" (UID: "b81f214f-994e-4a29-9171-c9860bd89fb8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.860718 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutronb842-account-delete-b9lmt"] Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.861884 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronb842-account-delete-b9lmt" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.874419 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.874610 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerName="ovn-northd" containerID="cri-o://fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" gracePeriod=30 Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.874691 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerName="openstack-network-exporter" containerID="cri-o://9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d" gracePeriod=30 Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.897736 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvd6v\" (UniqueName: \"kubernetes.io/projected/87faa73b-1148-48ae-88f4-3bdd06898658-kube-api-access-jvd6v\") pod \"placementd82d-account-delete-jlr78\" (UID: \"87faa73b-1148-48ae-88f4-3bdd06898658\") " pod="openstack/placementd82d-account-delete-jlr78" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.897805 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jd6h\" (UniqueName: \"kubernetes.io/projected/50622552-6b5c-4af5-a457-09c526c54f3f-kube-api-access-5jd6h\") pod \"glance15db-account-delete-rpvcb\" (UID: \"50622552-6b5c-4af5-a457-09c526c54f3f\") " pod="openstack/glance15db-account-delete-rpvcb" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.897894 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b81f214f-994e-4a29-9171-c9860bd89fb8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.909000 4991 generic.go:334] "Generic (PLEG): container finished" podID="b81f214f-994e-4a29-9171-c9860bd89fb8" containerID="534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d" exitCode=0 Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.909205 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xggwb" event={"ID":"b81f214f-994e-4a29-9171-c9860bd89fb8","Type":"ContainerDied","Data":"534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d"} Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.909325 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xggwb" event={"ID":"b81f214f-994e-4a29-9171-c9860bd89fb8","Type":"ContainerDied","Data":"2bb43381a4b23d80650eb1fe03b9be63ebbec3e626ccee240303c92e450f1454"} Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.909437 4991 scope.go:117] "RemoveContainer" containerID="534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.909641 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xggwb" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.935004 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronb842-account-delete-b9lmt"] Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.979789 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jd6h\" (UniqueName: \"kubernetes.io/projected/50622552-6b5c-4af5-a457-09c526c54f3f-kube-api-access-5jd6h\") pod \"glance15db-account-delete-rpvcb\" (UID: \"50622552-6b5c-4af5-a457-09c526c54f3f\") " pod="openstack/glance15db-account-delete-rpvcb" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.997838 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance15db-account-delete-rpvcb" Oct 06 08:42:33 crc kubenswrapper[4991]: I1006 08:42:33.999856 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg7vn\" (UniqueName: \"kubernetes.io/projected/4f1297ce-72cf-4b07-a66d-826e8e9c1663-kube-api-access-zg7vn\") pod \"neutronb842-account-delete-b9lmt\" (UID: \"4f1297ce-72cf-4b07-a66d-826e8e9c1663\") " pod="openstack/neutronb842-account-delete-b9lmt" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.000054 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvd6v\" (UniqueName: \"kubernetes.io/projected/87faa73b-1148-48ae-88f4-3bdd06898658-kube-api-access-jvd6v\") pod \"placementd82d-account-delete-jlr78\" (UID: \"87faa73b-1148-48ae-88f4-3bdd06898658\") " pod="openstack/placementd82d-account-delete-jlr78" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.024361 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.040417 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8bc2q"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.051673 4991 scope.go:117] "RemoveContainer" containerID="14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.075528 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-b5jwb"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.077265 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvd6v\" (UniqueName: \"kubernetes.io/projected/87faa73b-1148-48ae-88f4-3bdd06898658-kube-api-access-jvd6v\") pod \"placementd82d-account-delete-jlr78\" (UID: \"87faa73b-1148-48ae-88f4-3bdd06898658\") " pod="openstack/placementd82d-account-delete-jlr78" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.101878 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-b5jwb"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.122043 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg7vn\" (UniqueName: \"kubernetes.io/projected/4f1297ce-72cf-4b07-a66d-826e8e9c1663-kube-api-access-zg7vn\") pod \"neutronb842-account-delete-b9lmt\" (UID: \"4f1297ce-72cf-4b07-a66d-826e8e9c1663\") " pod="openstack/neutronb842-account-delete-b9lmt" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.248444 4991 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/neutron-7988dccf5c-j9ll7" secret="" err="secret \"neutron-neutron-dockercfg-5cb9d\" not found" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.250016 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg7vn\" (UniqueName: \"kubernetes.io/projected/4f1297ce-72cf-4b07-a66d-826e8e9c1663-kube-api-access-zg7vn\") pod \"neutronb842-account-delete-b9lmt\" (UID: \"4f1297ce-72cf-4b07-a66d-826e8e9c1663\") " pod="openstack/neutronb842-account-delete-b9lmt" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.272029 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4mpdq"] Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.304227 4991 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.304287 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data podName:1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:34.804268463 +0000 UTC m=+1406.542018484 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data") pod "rabbitmq-cell1-server-0" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1") : configmap "rabbitmq-cell1-config-data" not found Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.310217 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4mpdq"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.320613 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xggwb"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.345681 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xggwb"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.360709 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementd82d-account-delete-jlr78" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.381813 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-s5hfs"] Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.405420 4991 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.405471 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config podName:0a6703e0-1fac-4734-98ac-88f6163fdaae nodeName:}" failed. No retries permitted until 2025-10-06 08:42:34.90545734 +0000 UTC m=+1406.643207361 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config") pod "neutron-7988dccf5c-j9ll7" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae") : secret "neutron-config" not found Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.410087 4991 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.413043 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config podName:0a6703e0-1fac-4734-98ac-88f6163fdaae nodeName:}" failed. No retries permitted until 2025-10-06 08:42:34.912989899 +0000 UTC m=+1406.650739920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config") pod "neutron-7988dccf5c-j9ll7" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae") : secret "neutron-httpd-config" not found Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.417456 4991 scope.go:117] "RemoveContainer" containerID="0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.418565 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-s5hfs"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.437892 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicancd31-account-delete-l9h8d"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.440082 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicancd31-account-delete-l9h8d" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.457070 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicancd31-account-delete-l9h8d"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.467603 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.468779 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="1b135498-feb3-4024-b655-92f403f55bb9" containerName="openstack-network-exporter" containerID="cri-o://b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9" gracePeriod=300 Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.473322 4991 scope.go:117] "RemoveContainer" containerID="534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d" Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.496223 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d\": container with ID starting with 534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d not found: ID does not exist" containerID="534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.496330 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d"} err="failed to get container status \"534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d\": rpc error: code = NotFound desc = could not find container \"534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d\": container with ID starting with 534540ce42ec7671ec18b843915ac3a44f7c7c9596b5389e8fb7f28eb1e0d94d not found: ID does not exist" Oct 
06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.496367 4991 scope.go:117] "RemoveContainer" containerID="14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.497682 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinderb589-account-delete-h8q45"] Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.498630 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5\": container with ID starting with 14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5 not found: ID does not exist" containerID="14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.498671 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5"} err="failed to get container status \"14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5\": rpc error: code = NotFound desc = could not find container \"14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5\": container with ID starting with 14eda0f32c9ac331640030ae2f18dcaf237c4f64718981e2b15b240d359715e5 not found: ID does not exist" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.499073 4991 scope.go:117] "RemoveContainer" containerID="0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.499564 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutronb842-account-delete-b9lmt" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.499580 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderb589-account-delete-h8q45" Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.499939 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c\": container with ID starting with 0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c not found: ID does not exist" containerID="0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.499959 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c"} err="failed to get container status \"0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c\": rpc error: code = NotFound desc = could not find container \"0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c\": container with ID starting with 0265d77d7c4030621ce5c0712d4e0a9661df13293d4a21e767e727989d482e3c not found: ID does not exist" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.531241 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinderb589-account-delete-h8q45"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.583726 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.584571 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" containerName="openstack-network-exporter" containerID="cri-o://10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1" gracePeriod=300 Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.594783 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-5sltb"] Oct 06 08:42:34 crc 
kubenswrapper[4991]: I1006 08:42:34.610422 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6bxl\" (UniqueName: \"kubernetes.io/projected/305f56cb-d896-435c-ae06-4a407714b503-kube-api-access-w6bxl\") pod \"barbicancd31-account-delete-l9h8d\" (UID: \"305f56cb-d896-435c-ae06-4a407714b503\") " pod="openstack/barbicancd31-account-delete-l9h8d" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.610535 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmzzc\" (UniqueName: \"kubernetes.io/projected/7d3b515a-b48d-48f7-8775-a0299e07f231-kube-api-access-hmzzc\") pod \"cinderb589-account-delete-h8q45\" (UID: \"7d3b515a-b48d-48f7-8775-a0299e07f231\") " pod="openstack/cinderb589-account-delete-h8q45" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.616252 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-5sltb"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.651844 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-52tpz"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.669109 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-52tpz"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.709453 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.709983 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr" podUID="2d06311c-e246-4d3d-ba9c-388cb800ac4f" containerName="dnsmasq-dns" containerID="cri-o://d8899a5ea677f40567793637ebd89b430187e401ecc0a4df4ac6944a237de212" gracePeriod=10 Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.713207 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w6bxl\" (UniqueName: \"kubernetes.io/projected/305f56cb-d896-435c-ae06-4a407714b503-kube-api-access-w6bxl\") pod \"barbicancd31-account-delete-l9h8d\" (UID: \"305f56cb-d896-435c-ae06-4a407714b503\") " pod="openstack/barbicancd31-account-delete-l9h8d" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.713288 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmzzc\" (UniqueName: \"kubernetes.io/projected/7d3b515a-b48d-48f7-8775-a0299e07f231-kube-api-access-hmzzc\") pod \"cinderb589-account-delete-h8q45\" (UID: \"7d3b515a-b48d-48f7-8775-a0299e07f231\") " pod="openstack/cinderb589-account-delete-h8q45" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.742611 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="1b135498-feb3-4024-b655-92f403f55bb9" containerName="ovsdbserver-sb" containerID="cri-o://36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b" gracePeriod=300 Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.756659 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6bxl\" (UniqueName: \"kubernetes.io/projected/305f56cb-d896-435c-ae06-4a407714b503-kube-api-access-w6bxl\") pod \"barbicancd31-account-delete-l9h8d\" (UID: \"305f56cb-d896-435c-ae06-4a407714b503\") " pod="openstack/barbicancd31-account-delete-l9h8d" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.759768 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmzzc\" (UniqueName: \"kubernetes.io/projected/7d3b515a-b48d-48f7-8775-a0299e07f231-kube-api-access-hmzzc\") pod \"cinderb589-account-delete-h8q45\" (UID: \"7d3b515a-b48d-48f7-8775-a0299e07f231\") " pod="openstack/cinderb589-account-delete-h8q45" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.763377 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi9279-account-delete-bsk7x"] 
Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.773801 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi9279-account-delete-bsk7x" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.780427 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi9279-account-delete-bsk7x"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.791381 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell1ea2a-account-delete-5smxw"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.792614 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1ea2a-account-delete-5smxw" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.807595 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" containerName="ovsdbserver-nb" containerID="cri-o://d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e" gracePeriod=300 Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.818591 4991 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.818651 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data podName:1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:35.818638333 +0000 UTC m=+1407.556388354 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data") pod "rabbitmq-cell1-server-0" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1") : configmap "rabbitmq-cell1-config-data" not found Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.833573 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell1ea2a-account-delete-5smxw"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.833957 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicancd31-account-delete-l9h8d" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.860822 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.861136 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d1a24973-6ef6-4732-9a96-040ce646a707" containerName="glance-log" containerID="cri-o://2f29341e126502f19b2fe665eb6f63634e44f634ecd075749718883b8f004d5b" gracePeriod=30 Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.861616 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d1a24973-6ef6-4732-9a96-040ce646a707" containerName="glance-httpd" containerID="cri-o://4fbfc2abb485c8ccd9560493a1360ee31985544c8877ac7b1baa4f76139308c7" gracePeriod=30 Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.863458 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderb589-account-delete-h8q45" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.876832 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b98fcbb5b-2m256"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.877117 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b98fcbb5b-2m256" podUID="feb6a9a7-403e-4dc9-903c-349391d84efb" containerName="placement-log" containerID="cri-o://40cc2581ab3ca423c98e61d01fbf933e125eced21752dcff956d71eaf1890135" gracePeriod=30 Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.877382 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b98fcbb5b-2m256" podUID="feb6a9a7-403e-4dc9-903c-349391d84efb" containerName="placement-api" containerID="cri-o://fb10a7813dbd9816182db52dd9a3b501c4c766c110a2193adb6f6007214cdc4f" gracePeriod=30 Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.906728 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.921508 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pmvv\" (UniqueName: \"kubernetes.io/projected/f2791937-a79f-4d99-b895-6d3ac79ba220-kube-api-access-8pmvv\") pod \"novacell1ea2a-account-delete-5smxw\" (UID: \"f2791937-a79f-4d99-b895-6d3ac79ba220\") " pod="openstack/novacell1ea2a-account-delete-5smxw" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.921550 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvb7\" (UniqueName: \"kubernetes.io/projected/2b01de4c-42f4-4928-916a-6a9638340718-kube-api-access-fmvb7\") pod \"novaapi9279-account-delete-bsk7x\" (UID: \"2b01de4c-42f4-4928-916a-6a9638340718\") " pod="openstack/novaapi9279-account-delete-bsk7x" Oct 06 08:42:34 crc 
kubenswrapper[4991]: E1006 08:42:34.921820 4991 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.921869 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config podName:0a6703e0-1fac-4734-98ac-88f6163fdaae nodeName:}" failed. No retries permitted until 2025-10-06 08:42:35.921852249 +0000 UTC m=+1407.659602270 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config") pod "neutron-7988dccf5c-j9ll7" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae") : secret "neutron-httpd-config" not found Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.922171 4991 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Oct 06 08:42:34 crc kubenswrapper[4991]: E1006 08:42:34.922194 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config podName:0a6703e0-1fac-4734-98ac-88f6163fdaae nodeName:}" failed. No retries permitted until 2025-10-06 08:42:35.922187518 +0000 UTC m=+1407.659937539 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config") pod "neutron-7988dccf5c-j9ll7" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae") : secret "neutron-config" not found Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.968886 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1b135498-feb3-4024-b655-92f403f55bb9/ovsdbserver-sb/0.log" Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.968921 4991 generic.go:334] "Generic (PLEG): container finished" podID="1b135498-feb3-4024-b655-92f403f55bb9" containerID="b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9" exitCode=2 Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.968965 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1b135498-feb3-4024-b655-92f403f55bb9","Type":"ContainerDied","Data":"b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9"} Oct 06 08:42:34 crc kubenswrapper[4991]: I1006 08:42:34.970399 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-5prwt"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:34.993249 4991 generic.go:334] "Generic (PLEG): container finished" podID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" containerID="38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda" exitCode=0 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:34.993531 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bc2q" event={"ID":"544d772e-ee45-4bd6-9895-07dec1dc3ff1","Type":"ContainerDied","Data":"38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda"} Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:34.993555 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bc2q" 
event={"ID":"544d772e-ee45-4bd6-9895-07dec1dc3ff1","Type":"ContainerStarted","Data":"4534c655d15de1ec2556c633c215e41448bf05481ce559ec7dd1bfb799d8efa7"} Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.002746 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-df86g"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.002965 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-df86g" podUID="0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" containerName="openstack-network-exporter" containerID="cri-o://ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.035957 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pmvv\" (UniqueName: \"kubernetes.io/projected/f2791937-a79f-4d99-b895-6d3ac79ba220-kube-api-access-8pmvv\") pod \"novacell1ea2a-account-delete-5smxw\" (UID: \"f2791937-a79f-4d99-b895-6d3ac79ba220\") " pod="openstack/novacell1ea2a-account-delete-5smxw" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.036026 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvb7\" (UniqueName: \"kubernetes.io/projected/2b01de4c-42f4-4928-916a-6a9638340718-kube-api-access-fmvb7\") pod \"novaapi9279-account-delete-bsk7x\" (UID: \"2b01de4c-42f4-4928-916a-6a9638340718\") " pod="openstack/novaapi9279-account-delete-bsk7x" Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.037129 4991 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.037209 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data podName:53c6aca4-4fd0-4d42-bbe2-4b6e91643503 nodeName:}" failed. 
No retries permitted until 2025-10-06 08:42:35.537191146 +0000 UTC m=+1407.274941167 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data") pod "rabbitmq-server-0" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503") : configmap "rabbitmq-config-data" not found Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.049633 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jklxx"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.063098 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7988dccf5c-j9ll7"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.071723 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6ad6d483-bca3-4391-9e4c-290b6b15b1f4/ovsdbserver-nb/0.log" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.071808 4991 generic.go:334] "Generic (PLEG): container finished" podID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" containerID="10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1" exitCode=2 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.071898 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6ad6d483-bca3-4391-9e4c-290b6b15b1f4","Type":"ContainerDied","Data":"10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1"} Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.087965 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.088318 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="aa57b1fb-c743-4137-9501-a0110f385b1c" containerName="glance-log" containerID="cri-o://f83ef24bc48b9f3df4545258f80864d16d673fc45ac88575e0e485addba7df62" gracePeriod=30 Oct 06 08:42:35 
crc kubenswrapper[4991]: I1006 08:42:35.088472 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="aa57b1fb-c743-4137-9501-a0110f385b1c" containerName="glance-httpd" containerID="cri-o://9f7dc8083673fb521af061c8df5ca04354332444376a940728267b2a54832c2d" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.089487 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pmvv\" (UniqueName: \"kubernetes.io/projected/f2791937-a79f-4d99-b895-6d3ac79ba220-kube-api-access-8pmvv\") pod \"novacell1ea2a-account-delete-5smxw\" (UID: \"f2791937-a79f-4d99-b895-6d3ac79ba220\") " pod="openstack/novacell1ea2a-account-delete-5smxw" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.091571 4991 generic.go:334] "Generic (PLEG): container finished" podID="2d06311c-e246-4d3d-ba9c-388cb800ac4f" containerID="d8899a5ea677f40567793637ebd89b430187e401ecc0a4df4ac6944a237de212" exitCode=0 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.091709 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr" event={"ID":"2d06311c-e246-4d3d-ba9c-388cb800ac4f","Type":"ContainerDied","Data":"d8899a5ea677f40567793637ebd89b430187e401ecc0a4df4ac6944a237de212"} Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.093791 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvb7\" (UniqueName: \"kubernetes.io/projected/2b01de4c-42f4-4928-916a-6a9638340718-kube-api-access-fmvb7\") pod \"novaapi9279-account-delete-bsk7x\" (UID: \"2b01de4c-42f4-4928-916a-6a9638340718\") " pod="openstack/novaapi9279-account-delete-bsk7x" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.103698 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lglj7"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.135255 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-db-create-lglj7"] Oct 06 08:42:35 crc kubenswrapper[4991]: W1006 08:42:35.223657 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50622552_6b5c_4af5_a457_09c526c54f3f.slice/crio-4b1c4cb63acfc187b27cb2a3dfa633dfdc736b06e498d21d81a3c0e9609fb340 WatchSource:0}: Error finding container 4b1c4cb63acfc187b27cb2a3dfa633dfdc736b06e498d21d81a3c0e9609fb340: Status 404 returned error can't find the container with id 4b1c4cb63acfc187b27cb2a3dfa633dfdc736b06e498d21d81a3c0e9609fb340 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.229625 4991 generic.go:334] "Generic (PLEG): container finished" podID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerID="9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d" exitCode=2 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.230000 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7988dccf5c-j9ll7" podUID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerName="neutron-api" containerID="cri-o://d0aac78aa43c86da1a2d4708b970a7fa2c38a878adf032b4bc160cf815163a9d" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.230191 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"51a7066c-5143-43ab-b642-81f461a9c1f4","Type":"ContainerDied","Data":"9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d"} Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.230411 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7988dccf5c-j9ll7" podUID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerName="neutron-httpd" containerID="cri-o://93e5b235f20e302b6749df9897200518a9608b53c7db75afd7a755bd7c31a9e2" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.259399 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi9279-account-delete-bsk7x" Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.306590 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.308458 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.323094 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.323173 4991 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerName="ovn-northd" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.336245 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1ea2a-account-delete-5smxw" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.412459 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae62f13-d5be-414e-a6f9-9b2e475afbd1" path="/var/lib/kubelet/pods/5ae62f13-d5be-414e-a6f9-9b2e475afbd1/volumes" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.413353 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5" path="/var/lib/kubelet/pods/85987e42-3d5a-45e3-af5a-f1dd6b1bcfc5/volumes" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.414276 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad2fab6-f115-4ab3-b631-242ef3474da2" path="/var/lib/kubelet/pods/8ad2fab6-f115-4ab3-b631-242ef3474da2/volumes" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.415919 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3fead61-55ea-433e-afe9-983c17ef5cdf" path="/var/lib/kubelet/pods/a3fead61-55ea-433e-afe9-983c17ef5cdf/volumes" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.420404 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b81f214f-994e-4a29-9171-c9860bd89fb8" path="/var/lib/kubelet/pods/b81f214f-994e-4a29-9171-c9860bd89fb8/volumes" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.421629 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c473952b-d738-4c47-a5e2-c6f827ff4730" path="/var/lib/kubelet/pods/c473952b-d738-4c47-a5e2-c6f827ff4730/volumes" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.422412 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d139f7e8-c126-43bf-9a26-7692b455412b" path="/var/lib/kubelet/pods/d139f7e8-c126-43bf-9a26-7692b455412b/volumes" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.423449 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 06 08:42:35 crc 
kubenswrapper[4991]: I1006 08:42:35.423531 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-15db-account-create-j2zlh"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.423600 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance15db-account-delete-rpvcb"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.423660 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-15db-account-create-j2zlh"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.423716 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gc8kg"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.423768 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-gc8kg"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.424239 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-server" containerID="cri-o://ed12c4a932f30894215eff330feb00b02897cadb829ca357ed1fd45e5afdf1b3" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.424467 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-server" containerID="cri-o://d9388ecf0c6db1afc9baa8762ef9460101639492f4059916a5452baf6ce1da9b" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.424516 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-updater" containerID="cri-o://aebf96364238cb6b3d252db6049f87fc6c27dc0650a174ecda7b2742358b2979" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425101 4991 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-auditor" containerID="cri-o://cc510399cff86b9534906da4fd4dfb566ffc21c65dc3e7a29de4d1e16e9e7f7a" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425230 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="swift-recon-cron" containerID="cri-o://eb28a1e65b323917d5e53d7d3619b4b0894ce6380fa661067a656f0faf1a3966" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425498 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-replicator" containerID="cri-o://abaa2e04344e35bc84fdbd617310659cf3403a7924fe6ea867f216abcc6fa8c7" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425567 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-server" containerID="cri-o://18a56e04769a024151f561f4820a607601164263d72a0ba3ba3c5a8eb7b72631" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425326 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-updater" containerID="cri-o://264ceea5be73f445fe8809bba7e4a58faeb85d87ce005ae2e2337c4fbd772807" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425336 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-auditor" containerID="cri-o://8619a7be0d8b8d3e157358434fab68c5d39a5c107bae0e507da39b55321787f9" gracePeriod=30 Oct 06 08:42:35 crc 
kubenswrapper[4991]: I1006 08:42:35.425384 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-replicator" containerID="cri-o://fac75ff26b47c3f0e62bea6d62aa82cb9e5265892c9bea171fe5b4d799545d4b" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425481 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-replicator" containerID="cri-o://c9ef1fa176e4762e4800cf8c17d38583018327434b1f427f17c6368143ce1443" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425470 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-reaper" containerID="cri-o://3b537ff709c1788e201f7be5c9872d032b3f628ae4187cee84bd9ddc9645c96c" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425491 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-auditor" containerID="cri-o://cacc49468ee93ceabe894ccc8d50085a9655611b6c4501bf305bb67771d140e5" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425318 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-expirer" containerID="cri-o://662006c1a00d0cac716c8677f83ad79a7b88245c89d3c05d4a41987440c0babd" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.425307 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="rsync" 
containerID="cri-o://25950ee93c182d2a8f2b482674bcf125f0ce2007882775e9431029f2d5153184" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.444426 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance15db-account-delete-rpvcb"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.484286 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5prnr"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.498942 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5prnr"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.511926 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-fswkr"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.534462 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-fswkr"] Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.581248 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b is running failed: container process not found" containerID="36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.581371 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.581556 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" containerName="cinder-scheduler" containerID="cri-o://0d1610527cf8b6f50326a4d6ebe66a1e52c2dc1024a98e810b007cc8199eb0b7" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.581852 4991 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" containerName="probe" containerID="cri-o://d649d548626a4bd3bff872429af0bef8f3a02f2808a38286a2013d34229a5407" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.582073 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b is running failed: container process not found" containerID="36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.583776 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b is running failed: container process not found" containerID="36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.583804 4991 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="1b135498-feb3-4024-b655-92f403f55bb9" containerName="ovsdbserver-sb" Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.610160 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4bh74"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.622952 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d82d-account-create-27rsm"] Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.625100 4991 configmap.go:193] Couldn't get 
configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.625171 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data podName:53c6aca4-4fd0-4d42-bbe2-4b6e91643503 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:36.625155581 +0000 UTC m=+1408.362905592 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data") pod "rabbitmq-server-0" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503") : configmap "rabbitmq-config-data" not found Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.639105 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4bh74"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.657718 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d82d-account-create-27rsm"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.671641 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementd82d-account-delete-jlr78"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.687284 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementd82d-account-delete-jlr78"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.707796 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.708119 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" containerName="cinder-api-log" containerID="cri-o://47090eb349924642f543c04a66d3390950a21742c182060091cfbb40d99efe76" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.710806 4991 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-api-0" podUID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" containerName="cinder-api" containerID="cri-o://4e08aae5f1f3064fd06a75855d7641f5f9a9574da5cd200704d0371193acd2b3" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.729851 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-92hrh"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.739370 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-92hrh"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.759041 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b842-account-create-cf7lf"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.781333 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronb842-account-delete-b9lmt"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.796151 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b842-account-create-cf7lf"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.808344 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-9t6rn"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.816705 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-9t6rn"] Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.828795 4991 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.828853 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data podName:1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:37.828839562 +0000 UTC m=+1409.566589583 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data") pod "rabbitmq-cell1-server-0" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1") : configmap "rabbitmq-cell1-config-data" not found Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.832334 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicancd31-account-delete-l9h8d"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.842571 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cd31-account-create-pmq92"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.858347 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-tqm4c"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.870375 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-cd31-account-create-pmq92"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.871255 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b589-account-create-k2shf"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.883363 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-tqm4c"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.923547 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderb589-account-delete-h8q45"] Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.930145 4991 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.930206 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config podName:0a6703e0-1fac-4734-98ac-88f6163fdaae nodeName:}" failed. No retries permitted until 2025-10-06 08:42:37.930192525 +0000 UTC m=+1409.667942536 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config") pod "neutron-7988dccf5c-j9ll7" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae") : secret "neutron-config" not found Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.933414 4991 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Oct 06 08:42:35 crc kubenswrapper[4991]: E1006 08:42:35.933475 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config podName:0a6703e0-1fac-4734-98ac-88f6163fdaae nodeName:}" failed. No retries permitted until 2025-10-06 08:42:37.933460079 +0000 UTC m=+1409.671210100 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config") pod "neutron-7988dccf5c-j9ll7" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae") : secret "neutron-httpd-config" not found Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.947445 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b589-account-create-k2shf"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.973370 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-75c547987d-brwwk"] Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.973574 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" podUID="ab7f3760-250c-4e34-8bde-7e9218b711ff" containerName="barbican-keystone-listener-log" containerID="cri-o://118c1de5d5587e621349f796c4342c648e5951d232a1d0e3dcb3fd1f0b4f7705" gracePeriod=30 Oct 06 08:42:35 crc kubenswrapper[4991]: I1006 08:42:35.973921 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" 
podUID="ab7f3760-250c-4e34-8bde-7e9218b711ff" containerName="barbican-keystone-listener" containerID="cri-o://4410132c0aa760f431a4973b217154eb03f1a1acbcd426c9c298f0ff9b2290ca" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.005988 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-79798cd5c5-jz6kb"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.006446 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-79798cd5c5-jz6kb" podUID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" containerName="barbican-worker-log" containerID="cri-o://6cff5f4fbbdd1f8906fc62c1c37a25292d4484752ab929ce099738e8a4117501" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.006687 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-79798cd5c5-jz6kb" podUID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" containerName="barbican-worker" containerID="cri-o://d85323f86704585d0954acacab967e959246d8405dc02badbe6e793e45cbe71b" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.056581 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-548cc795f4-8m4d9"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.056867 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-548cc795f4-8m4d9" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerName="barbican-api-log" containerID="cri-o://ed3d4866db94527f98aa6062572670cd20f71dc34b4e9fe3ca2ccfae1b03bda2" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.057388 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-548cc795f4-8m4d9" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerName="barbican-api" containerID="cri-o://0c70b35b1a4450b4db02a166e4cb0db2437a7fcc554b453e7d86b3f8efc7685d" gracePeriod=30 Oct 06 
08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.107690 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.107943 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-log" containerID="cri-o://4c49ecde03a108088eaff49d978ace50e6654f0b6205db59fddf267b0df1faab" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.108076 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-metadata" containerID="cri-o://6a0dd291d385b5c827db71bd9cfc93863a8619475a3e50f8d7d1d405394842f9" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.139896 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.180227 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-m72rm"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.196435 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-m72rm"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.215629 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.216094 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="48f4202b-6558-4fe3-8fcc-732aa1a88e60" containerName="nova-scheduler-scheduler" containerID="cri-o://4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.240710 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/novaapi9279-account-delete-bsk7x"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.251190 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-z229c"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.258847 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-z229c"] Oct 06 08:42:36 crc kubenswrapper[4991]: W1006 08:42:36.268631 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod305f56cb_d896_435c_ae06_4a407714b503.slice/crio-e07b6d6d9bc08140ce4a9c1fb9f8b41eeb82670ae60b440d3e025e27bc9e9909 WatchSource:0}: Error finding container e07b6d6d9bc08140ce4a9c1fb9f8b41eeb82670ae60b440d3e025e27bc9e9909: Status 404 returned error can't find the container with id e07b6d6d9bc08140ce4a9c1fb9f8b41eeb82670ae60b440d3e025e27bc9e9909 Oct 06 08:42:36 crc kubenswrapper[4991]: E1006 08:42:36.268785 4991 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 06 08:42:36 crc kubenswrapper[4991]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 06 08:42:36 crc kubenswrapper[4991]: + source /usr/local/bin/container-scripts/functions Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNBridge=br-int Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNRemote=tcp:localhost:6642 Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNEncapType=geneve Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNAvailabilityZones= Oct 06 08:42:36 crc kubenswrapper[4991]: ++ EnableChassisAsGateway=true Oct 06 08:42:36 crc kubenswrapper[4991]: ++ PhysicalNetworks= Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNHostName= Oct 06 08:42:36 crc kubenswrapper[4991]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 06 08:42:36 crc kubenswrapper[4991]: ++ ovs_dir=/var/lib/openvswitch Oct 06 08:42:36 crc kubenswrapper[4991]: ++ 
FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 06 08:42:36 crc kubenswrapper[4991]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 06 08:42:36 crc kubenswrapper[4991]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 06 08:42:36 crc kubenswrapper[4991]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 06 08:42:36 crc kubenswrapper[4991]: + sleep 0.5 Oct 06 08:42:36 crc kubenswrapper[4991]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 06 08:42:36 crc kubenswrapper[4991]: + sleep 0.5 Oct 06 08:42:36 crc kubenswrapper[4991]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 06 08:42:36 crc kubenswrapper[4991]: + cleanup_ovsdb_server_semaphore Oct 06 08:42:36 crc kubenswrapper[4991]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 06 08:42:36 crc kubenswrapper[4991]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 06 08:42:36 crc kubenswrapper[4991]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-5prwt" message=< Oct 06 08:42:36 crc kubenswrapper[4991]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 06 08:42:36 crc kubenswrapper[4991]: + source /usr/local/bin/container-scripts/functions Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNBridge=br-int Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNRemote=tcp:localhost:6642 Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNEncapType=geneve Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNAvailabilityZones= Oct 06 08:42:36 crc kubenswrapper[4991]: ++ EnableChassisAsGateway=true Oct 06 08:42:36 crc kubenswrapper[4991]: ++ PhysicalNetworks= Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNHostName= Oct 06 08:42:36 crc kubenswrapper[4991]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 06 08:42:36 crc kubenswrapper[4991]: ++ 
ovs_dir=/var/lib/openvswitch Oct 06 08:42:36 crc kubenswrapper[4991]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 06 08:42:36 crc kubenswrapper[4991]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 06 08:42:36 crc kubenswrapper[4991]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 06 08:42:36 crc kubenswrapper[4991]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 06 08:42:36 crc kubenswrapper[4991]: + sleep 0.5 Oct 06 08:42:36 crc kubenswrapper[4991]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 06 08:42:36 crc kubenswrapper[4991]: + sleep 0.5 Oct 06 08:42:36 crc kubenswrapper[4991]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 06 08:42:36 crc kubenswrapper[4991]: + cleanup_ovsdb_server_semaphore Oct 06 08:42:36 crc kubenswrapper[4991]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 06 08:42:36 crc kubenswrapper[4991]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 06 08:42:36 crc kubenswrapper[4991]: > Oct 06 08:42:36 crc kubenswrapper[4991]: E1006 08:42:36.268841 4991 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 06 08:42:36 crc kubenswrapper[4991]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 06 08:42:36 crc kubenswrapper[4991]: + source /usr/local/bin/container-scripts/functions Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNBridge=br-int Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNRemote=tcp:localhost:6642 Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNEncapType=geneve Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNAvailabilityZones= Oct 06 08:42:36 crc kubenswrapper[4991]: ++ EnableChassisAsGateway=true Oct 06 08:42:36 crc kubenswrapper[4991]: ++ PhysicalNetworks= Oct 06 08:42:36 crc kubenswrapper[4991]: ++ OVNHostName= Oct 06 08:42:36 crc 
kubenswrapper[4991]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 06 08:42:36 crc kubenswrapper[4991]: ++ ovs_dir=/var/lib/openvswitch Oct 06 08:42:36 crc kubenswrapper[4991]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 06 08:42:36 crc kubenswrapper[4991]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 06 08:42:36 crc kubenswrapper[4991]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 06 08:42:36 crc kubenswrapper[4991]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 06 08:42:36 crc kubenswrapper[4991]: + sleep 0.5 Oct 06 08:42:36 crc kubenswrapper[4991]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 06 08:42:36 crc kubenswrapper[4991]: + sleep 0.5 Oct 06 08:42:36 crc kubenswrapper[4991]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 06 08:42:36 crc kubenswrapper[4991]: + cleanup_ovsdb_server_semaphore Oct 06 08:42:36 crc kubenswrapper[4991]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 06 08:42:36 crc kubenswrapper[4991]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 06 08:42:36 crc kubenswrapper[4991]: > pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server" containerID="cri-o://2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.268894 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server" containerID="cri-o://2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" gracePeriod=29 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.272653 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ea2a-account-create-4hp6d"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.291577 
4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9279-account-create-qgf7w"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.341789 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9279-account-create-qgf7w"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.350405 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ea2a-account-create-4hp6d"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.358388 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.358625 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="23e696d7-7767-4a92-9828-a189ffb52275" containerName="nova-api-log" containerID="cri-o://832edd5d33c524ced05fce73559b98b910c69bcaa4f037231d4db46add5712d9" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.358747 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="23e696d7-7767-4a92-9828-a189ffb52275" containerName="nova-api-api" containerID="cri-o://98d4fcdfdc9774dff4624bf92e206f1e36780461435c0e70b7a79655aa1bd813" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.362552 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" event={"ID":"ab7f3760-250c-4e34-8bde-7e9218b711ff","Type":"ContainerDied","Data":"118c1de5d5587e621349f796c4342c648e5951d232a1d0e3dcb3fd1f0b4f7705"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.365983 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.374341 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1ea2a-account-delete-5smxw"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.376924 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovs-vswitchd" containerID="cri-o://6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" gracePeriod=29 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.378627 4991 generic.go:334] "Generic (PLEG): container finished" podID="ab7f3760-250c-4e34-8bde-7e9218b711ff" containerID="118c1de5d5587e621349f796c4342c648e5951d232a1d0e3dcb3fd1f0b4f7705" exitCode=143 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.390817 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a45f-account-create-n424l"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.393725 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6ad6d483-bca3-4391-9e4c-290b6b15b1f4/ovsdbserver-nb/0.log" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.393954 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.394040 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"815c282e-cc40-4ff8-b3f8-155d9a91a20b","Type":"ContainerDied","Data":"47090eb349924642f543c04a66d3390950a21742c182060091cfbb40d99efe76"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.393994 4991 generic.go:334] "Generic (PLEG): container finished" podID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" containerID="47090eb349924642f543c04a66d3390950a21742c182060091cfbb40d99efe76" exitCode=143 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.397993 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance15db-account-delete-rpvcb" event={"ID":"50622552-6b5c-4af5-a457-09c526c54f3f","Type":"ContainerStarted","Data":"4b1c4cb63acfc187b27cb2a3dfa633dfdc736b06e498d21d81a3c0e9609fb340"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.406248 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-6pm4v"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.410049 4991 generic.go:334] "Generic (PLEG): container finished" podID="d1a24973-6ef6-4732-9a96-040ce646a707" containerID="2f29341e126502f19b2fe665eb6f63634e44f634ecd075749718883b8f004d5b" exitCode=143 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.410125 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d1a24973-6ef6-4732-9a96-040ce646a707","Type":"ContainerDied","Data":"2f29341e126502f19b2fe665eb6f63634e44f634ecd075749718883b8f004d5b"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.422806 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-df86g_0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6/openstack-network-exporter/0.log" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.422884 4991 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.426873 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-6pm4v"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.436976 4991 generic.go:334] "Generic (PLEG): container finished" podID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerID="ed3d4866db94527f98aa6062572670cd20f71dc34b4e9fe3ca2ccfae1b03bda2" exitCode=143 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.437060 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548cc795f4-8m4d9" event={"ID":"a9be32ba-d183-4fd5-ba8b-63f79c973c81","Type":"ContainerDied","Data":"ed3d4866db94527f98aa6062572670cd20f71dc34b4e9fe3ca2ccfae1b03bda2"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.445028 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a45f-account-create-n424l"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.450917 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.464846 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1b135498-feb3-4024-b655-92f403f55bb9/ovsdbserver-sb/0.log" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.464923 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.477061 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.477258 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f4175b5d-7866-481a-a923-1ae5f3307195" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c3b1614500005292c9e7b6920ac4a7cc87e019fd8e824585e552366b6101a5ab" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.496717 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.496896 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="c5b53689-326b-4f4c-a625-beec7a3631fa" containerName="nova-cell1-conductor-conductor" containerID="cri-o://05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500533 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="25950ee93c182d2a8f2b482674bcf125f0ce2007882775e9431029f2d5153184" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500742 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="662006c1a00d0cac716c8677f83ad79a7b88245c89d3c05d4a41987440c0babd" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500742 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"25950ee93c182d2a8f2b482674bcf125f0ce2007882775e9431029f2d5153184"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500764 
4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="264ceea5be73f445fe8809bba7e4a58faeb85d87ce005ae2e2337c4fbd772807" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500772 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="8619a7be0d8b8d3e157358434fab68c5d39a5c107bae0e507da39b55321787f9" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500780 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="fac75ff26b47c3f0e62bea6d62aa82cb9e5265892c9bea171fe5b4d799545d4b" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500787 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="d9388ecf0c6db1afc9baa8762ef9460101639492f4059916a5452baf6ce1da9b" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500792 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="aebf96364238cb6b3d252db6049f87fc6c27dc0650a174ecda7b2742358b2979" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500799 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="cc510399cff86b9534906da4fd4dfb566ffc21c65dc3e7a29de4d1e16e9e7f7a" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500808 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="abaa2e04344e35bc84fdbd617310659cf3403a7924fe6ea867f216abcc6fa8c7" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500816 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="18a56e04769a024151f561f4820a607601164263d72a0ba3ba3c5a8eb7b72631" exitCode=0 Oct 06 
08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500822 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="3b537ff709c1788e201f7be5c9872d032b3f628ae4187cee84bd9ddc9645c96c" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500828 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="cacc49468ee93ceabe894ccc8d50085a9655611b6c4501bf305bb67771d140e5" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500833 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="c9ef1fa176e4762e4800cf8c17d38583018327434b1f427f17c6368143ce1443" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500840 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="ed12c4a932f30894215eff330feb00b02897cadb829ca357ed1fd45e5afdf1b3" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500799 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"662006c1a00d0cac716c8677f83ad79a7b88245c89d3c05d4a41987440c0babd"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500886 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"264ceea5be73f445fe8809bba7e4a58faeb85d87ce005ae2e2337c4fbd772807"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500896 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"8619a7be0d8b8d3e157358434fab68c5d39a5c107bae0e507da39b55321787f9"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500905 4991 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"fac75ff26b47c3f0e62bea6d62aa82cb9e5265892c9bea171fe5b4d799545d4b"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500914 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"d9388ecf0c6db1afc9baa8762ef9460101639492f4059916a5452baf6ce1da9b"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500924 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"aebf96364238cb6b3d252db6049f87fc6c27dc0650a174ecda7b2742358b2979"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500933 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"cc510399cff86b9534906da4fd4dfb566ffc21c65dc3e7a29de4d1e16e9e7f7a"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500942 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"abaa2e04344e35bc84fdbd617310659cf3403a7924fe6ea867f216abcc6fa8c7"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500950 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"18a56e04769a024151f561f4820a607601164263d72a0ba3ba3c5a8eb7b72631"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500959 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"3b537ff709c1788e201f7be5c9872d032b3f628ae4187cee84bd9ddc9645c96c"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500968 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"cacc49468ee93ceabe894ccc8d50085a9655611b6c4501bf305bb67771d140e5"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500976 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"c9ef1fa176e4762e4800cf8c17d38583018327434b1f427f17c6368143ce1443"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.500984 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"ed12c4a932f30894215eff330feb00b02897cadb829ca357ed1fd45e5afdf1b3"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.508143 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-smsnb"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.514777 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-smsnb"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.518761 4991 generic.go:334] "Generic (PLEG): container finished" podID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerID="4c49ecde03a108088eaff49d978ace50e6654f0b6205db59fddf267b0df1faab" exitCode=143 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.518822 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70e2b1c5-03aa-4472-9002-7daf936edc67","Type":"ContainerDied","Data":"4c49ecde03a108088eaff49d978ace50e6654f0b6205db59fddf267b0df1faab"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 
08:42:36.521416 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zzkxl"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.522504 4991 generic.go:334] "Generic (PLEG): container finished" podID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" containerID="6cff5f4fbbdd1f8906fc62c1c37a25292d4484752ab929ce099738e8a4117501" exitCode=143 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.522571 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-79798cd5c5-jz6kb" event={"ID":"b2720ee8-eb06-4a0b-9bee-153b69ee769e","Type":"ContainerDied","Data":"6cff5f4fbbdd1f8906fc62c1c37a25292d4484752ab929ce099738e8a4117501"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.524055 4991 generic.go:334] "Generic (PLEG): container finished" podID="e8e91b06-a3c1-41dc-b2f8-af738647ade8" containerID="81c13c33c57ac2a7fafdd527f3ce9a1ddf23d76bafb0a388ddb5af43c282a8b4" exitCode=137 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.525615 4991 generic.go:334] "Generic (PLEG): container finished" podID="feb6a9a7-403e-4dc9-903c-349391d84efb" containerID="40cc2581ab3ca423c98e61d01fbf933e125eced21752dcff956d71eaf1890135" exitCode=143 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.525652 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b98fcbb5b-2m256" event={"ID":"feb6a9a7-403e-4dc9-903c-349391d84efb","Type":"ContainerDied","Data":"40cc2581ab3ca423c98e61d01fbf933e125eced21752dcff956d71eaf1890135"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.526475 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicancd31-account-delete-l9h8d" event={"ID":"305f56cb-d896-435c-ae06-4a407714b503","Type":"ContainerStarted","Data":"e07b6d6d9bc08140ce4a9c1fb9f8b41eeb82670ae60b440d3e025e27bc9e9909"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.527614 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr" event={"ID":"2d06311c-e246-4d3d-ba9c-388cb800ac4f","Type":"ContainerDied","Data":"8837db3f318960e4eaf4d8c389f5127148a69eb983ba046be127e70ea5f58f7d"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.527654 4991 scope.go:117] "RemoveContainer" containerID="d8899a5ea677f40567793637ebd89b430187e401ecc0a4df4ac6944a237de212" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.527748 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-6qdfr" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.532561 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zzkxl"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.535528 4991 generic.go:334] "Generic (PLEG): container finished" podID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerID="93e5b235f20e302b6749df9897200518a9608b53c7db75afd7a755bd7c31a9e2" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.535570 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7988dccf5c-j9ll7" event={"ID":"0a6703e0-1fac-4734-98ac-88f6163fdaae","Type":"ContainerDied","Data":"93e5b235f20e302b6749df9897200518a9608b53c7db75afd7a755bd7c31a9e2"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.540021 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6ad6d483-bca3-4391-9e4c-290b6b15b1f4/ovsdbserver-nb/0.log" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.540063 4991 generic.go:334] "Generic (PLEG): container finished" podID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" containerID="d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e" exitCode=143 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.540119 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"6ad6d483-bca3-4391-9e4c-290b6b15b1f4","Type":"ContainerDied","Data":"d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.540138 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6ad6d483-bca3-4391-9e4c-290b6b15b1f4","Type":"ContainerDied","Data":"4ca1911997e9bbd3466058aaa12c187f5778061ad0ecd52fb889992119a4044f"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.540188 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.540888 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.541337 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="697548ef-9b89-4827-a5f1-4e535ae94722" containerName="nova-cell0-conductor-conductor" containerID="cri-o://758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.546242 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qml8q\" (UniqueName: \"kubernetes.io/projected/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-kube-api-access-qml8q\") pod \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.546352 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdbserver-nb-tls-certs\") pod \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.547323 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-sb\") pod \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.547363 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-metrics-certs-tls-certs\") pod \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.547407 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-config\") pod \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.547448 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-config\") pod \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548015 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-metrics-certs-tls-certs\") pod \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548073 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-combined-ca-bundle\") pod \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\" (UID: 
\"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548098 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovs-rundir\") pod \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548133 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q77j\" (UniqueName: \"kubernetes.io/projected/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-kube-api-access-6q77j\") pod \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548152 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-config\") pod \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548184 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-scripts\") pod \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548203 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovn-rundir\") pod \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548228 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548280 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hs5k\" (UniqueName: \"kubernetes.io/projected/2d06311c-e246-4d3d-ba9c-388cb800ac4f-kube-api-access-6hs5k\") pod \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548349 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-combined-ca-bundle\") pod \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\" (UID: \"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548441 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-nb\") pod \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548484 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdb-rundir\") pod \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\" (UID: \"6ad6d483-bca3-4391-9e4c-290b6b15b1f4\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548512 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-svc\") pod \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.548568 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-swift-storage-0\") pod \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\" (UID: \"2d06311c-e246-4d3d-ba9c-388cb800ac4f\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.549190 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" (UID: "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.549653 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" (UID: "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.549993 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6ad6d483-bca3-4391-9e4c-290b6b15b1f4" (UID: "6ad6d483-bca3-4391-9e4c-290b6b15b1f4"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.550012 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-scripts" (OuterVolumeSpecName: "scripts") pod "6ad6d483-bca3-4391-9e4c-290b6b15b1f4" (UID: "6ad6d483-bca3-4391-9e4c-290b6b15b1f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.550063 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-config" (OuterVolumeSpecName: "config") pod "6ad6d483-bca3-4391-9e4c-290b6b15b1f4" (UID: "6ad6d483-bca3-4391-9e4c-290b6b15b1f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.550673 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-config" (OuterVolumeSpecName: "config") pod "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" (UID: "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.554675 4991 generic.go:334] "Generic (PLEG): container finished" podID="aa57b1fb-c743-4137-9501-a0110f385b1c" containerID="f83ef24bc48b9f3df4545258f80864d16d673fc45ac88575e0e485addba7df62" exitCode=143 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.554768 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa57b1fb-c743-4137-9501-a0110f385b1c","Type":"ContainerDied","Data":"f83ef24bc48b9f3df4545258f80864d16d673fc45ac88575e0e485addba7df62"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.556584 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-kube-api-access-qml8q" (OuterVolumeSpecName: "kube-api-access-qml8q") pod "6ad6d483-bca3-4391-9e4c-290b6b15b1f4" (UID: "6ad6d483-bca3-4391-9e4c-290b6b15b1f4"). InnerVolumeSpecName "kube-api-access-qml8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.557627 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-df86g_0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6/openstack-network-exporter/0.log" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.557669 4991 generic.go:334] "Generic (PLEG): container finished" podID="0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" containerID="ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb" exitCode=2 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.557708 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-df86g" event={"ID":"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6","Type":"ContainerDied","Data":"ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.557731 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-df86g" event={"ID":"0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6","Type":"ContainerDied","Data":"557b7992f30f35d6f7674d6a98d19517cd682b267963b24a83c30d22b62d0339"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.557778 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-df86g" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.562572 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.569981 4991 generic.go:334] "Generic (PLEG): container finished" podID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" exitCode=0 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.570036 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5prwt" event={"ID":"63c7d8f9-5c85-4999-b60b-517b03ff5992","Type":"ContainerDied","Data":"2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.574501 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1b135498-feb3-4024-b655-92f403f55bb9/ovsdbserver-sb/0.log" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.574543 4991 generic.go:334] "Generic (PLEG): container finished" podID="1b135498-feb3-4024-b655-92f403f55bb9" containerID="36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b" exitCode=143 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.574631 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.575336 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1b135498-feb3-4024-b655-92f403f55bb9","Type":"ContainerDied","Data":"36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.578924 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "6ad6d483-bca3-4391-9e4c-290b6b15b1f4" (UID: "6ad6d483-bca3-4391-9e4c-290b6b15b1f4"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.586335 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronb842-account-delete-b9lmt"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.586380 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb842-account-delete-b9lmt" event={"ID":"4f1297ce-72cf-4b07-a66d-826e8e9c1663","Type":"ContainerStarted","Data":"fb50fe6f781ad98b7568e7845a3eb4387b552f0fb9d542189b5c26f5d7d30f84"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.587587 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementd82d-account-delete-jlr78" event={"ID":"87faa73b-1148-48ae-88f4-3bdd06898658","Type":"ContainerStarted","Data":"1f4913800a4fbe96c73ba606ca910bdb6f240e725720313f501e653b814e1009"} Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.595503 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-kube-api-access-6q77j" (OuterVolumeSpecName: "kube-api-access-6q77j") pod "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" (UID: "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6"). 
InnerVolumeSpecName "kube-api-access-6q77j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.595554 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d06311c-e246-4d3d-ba9c-388cb800ac4f-kube-api-access-6hs5k" (OuterVolumeSpecName: "kube-api-access-6hs5k") pod "2d06311c-e246-4d3d-ba9c-388cb800ac4f" (UID: "2d06311c-e246-4d3d-ba9c-388cb800ac4f"). InnerVolumeSpecName "kube-api-access-6hs5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.613017 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicancd31-account-delete-l9h8d"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.621132 4991 scope.go:117] "RemoveContainer" containerID="1ffa5d61a2db9844f49f116652b16947e4d804b4b63149c5790e80ca525ec7f3" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.649602 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95x62\" (UniqueName: \"kubernetes.io/projected/1b135498-feb3-4024-b655-92f403f55bb9-kube-api-access-95x62\") pod \"1b135498-feb3-4024-b655-92f403f55bb9\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.649820 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-ovsdbserver-sb-tls-certs\") pod \"1b135498-feb3-4024-b655-92f403f55bb9\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.649913 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-metrics-certs-tls-certs\") pod \"1b135498-feb3-4024-b655-92f403f55bb9\" (UID: 
\"1b135498-feb3-4024-b655-92f403f55bb9\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.650111 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1b135498-feb3-4024-b655-92f403f55bb9\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.650222 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1b135498-feb3-4024-b655-92f403f55bb9-ovsdb-rundir\") pod \"1b135498-feb3-4024-b655-92f403f55bb9\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.650366 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-combined-ca-bundle\") pod \"1b135498-feb3-4024-b655-92f403f55bb9\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.650459 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-config\") pod \"1b135498-feb3-4024-b655-92f403f55bb9\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.650600 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-scripts\") pod \"1b135498-feb3-4024-b655-92f403f55bb9\" (UID: \"1b135498-feb3-4024-b655-92f403f55bb9\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651031 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b135498-feb3-4024-b655-92f403f55bb9-ovsdb-rundir" 
(OuterVolumeSpecName: "ovsdb-rundir") pod "1b135498-feb3-4024-b655-92f403f55bb9" (UID: "1b135498-feb3-4024-b655-92f403f55bb9"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651145 4991 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651213 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q77j\" (UniqueName: \"kubernetes.io/projected/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-kube-api-access-6q77j\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651268 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651396 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651476 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651540 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651593 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hs5k\" (UniqueName: 
\"kubernetes.io/projected/2d06311c-e246-4d3d-ba9c-388cb800ac4f-kube-api-access-6hs5k\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651677 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651747 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qml8q\" (UniqueName: \"kubernetes.io/projected/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-kube-api-access-qml8q\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.651802 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.653798 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-config" (OuterVolumeSpecName: "config") pod "1b135498-feb3-4024-b655-92f403f55bb9" (UID: "1b135498-feb3-4024-b655-92f403f55bb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: E1006 08:42:36.657954 4991 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.657972 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-scripts" (OuterVolumeSpecName: "scripts") pod "1b135498-feb3-4024-b655-92f403f55bb9" (UID: "1b135498-feb3-4024-b655-92f403f55bb9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: E1006 08:42:36.658020 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data podName:53c6aca4-4fd0-4d42-bbe2-4b6e91643503 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:38.658002278 +0000 UTC m=+1410.395752309 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data") pod "rabbitmq-server-0" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503") : configmap "rabbitmq-config-data" not found Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.660259 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.667706 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b135498-feb3-4024-b655-92f403f55bb9-kube-api-access-95x62" (OuterVolumeSpecName: "kube-api-access-95x62") pod "1b135498-feb3-4024-b655-92f403f55bb9" (UID: "1b135498-feb3-4024-b655-92f403f55bb9"). InnerVolumeSpecName "kube-api-access-95x62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.669967 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" containerName="rabbitmq" containerID="cri-o://3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab" gracePeriod=604800 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.673414 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="157f3f65-3397-4a2d-98ea-1ae5897c7a76" containerName="galera" containerID="cri-o://94c589983290634c76235daa1990cab452138af9c99951302ddc413d46fc20a4" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.686742 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "1b135498-feb3-4024-b655-92f403f55bb9" (UID: "1b135498-feb3-4024-b655-92f403f55bb9"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.728681 4991 scope.go:117] "RemoveContainer" containerID="10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.750669 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinderb589-account-delete-h8q45"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.754873 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config-secret\") pod \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.755016 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzk82\" (UniqueName: \"kubernetes.io/projected/e8e91b06-a3c1-41dc-b2f8-af738647ade8-kube-api-access-zzk82\") pod \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.755035 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config\") pod \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.755099 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-combined-ca-bundle\") pod \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\" (UID: \"e8e91b06-a3c1-41dc-b2f8-af738647ade8\") " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.755518 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.755534 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1b135498-feb3-4024-b655-92f403f55bb9-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.755544 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.755554 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1b135498-feb3-4024-b655-92f403f55bb9-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.755562 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95x62\" (UniqueName: \"kubernetes.io/projected/1b135498-feb3-4024-b655-92f403f55bb9-kube-api-access-95x62\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.779449 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1ea2a-account-delete-5smxw"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.791590 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" containerName="rabbitmq" containerID="cri-o://304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c" gracePeriod=604800 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.805024 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8e91b06-a3c1-41dc-b2f8-af738647ade8-kube-api-access-zzk82" (OuterVolumeSpecName: "kube-api-access-zzk82") pod "e8e91b06-a3c1-41dc-b2f8-af738647ade8" 
(UID: "e8e91b06-a3c1-41dc-b2f8-af738647ade8"). InnerVolumeSpecName "kube-api-access-zzk82". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.811233 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi9279-account-delete-bsk7x"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.857636 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzk82\" (UniqueName: \"kubernetes.io/projected/e8e91b06-a3c1-41dc-b2f8-af738647ade8-kube-api-access-zzk82\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.945480 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-856f6664f9-gqcn7"] Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.945698 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-856f6664f9-gqcn7" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" containerName="proxy-httpd" containerID="cri-o://7145e5dc1f6f11d5b7c94e4ed5a3f94d613b31585eea12e4bfb621d2b89f737e" gracePeriod=30 Oct 06 08:42:36 crc kubenswrapper[4991]: I1006 08:42:36.946094 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-856f6664f9-gqcn7" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" containerName="proxy-server" containerID="cri-o://2b98780e70d84a8aec415e425c48e44718a23c872945a5f7884260c8ef099a6e" gracePeriod=30 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.009938 4991 scope.go:117] "RemoveContainer" containerID="d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e" Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.086448 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.086580 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.089276 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.089379 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.093722 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.093772 4991 prober.go:104] "Probe errored" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.107526 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-856f6664f9-gqcn7" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.164:8080/healthcheck\": dial tcp 10.217.0.164:8080: connect: connection refused" Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.107576 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.107617 4991 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovs-vswitchd" Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.107643 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.107748 4991 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/swift-proxy-856f6664f9-gqcn7" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.164:8080/healthcheck\": dial tcp 10.217.0.164:8080: connect: connection refused" Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.113355 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.114391 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.114446 4991 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="48f4202b-6558-4fe3-8fcc-732aa1a88e60" containerName="nova-scheduler-scheduler" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.288716 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0254b022-c378-4fff-bc49-15778c28e8e0" path="/var/lib/kubelet/pods/0254b022-c378-4fff-bc49-15778c28e8e0/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.289218 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f477921-a357-4895-bad9-8489244afd27" path="/var/lib/kubelet/pods/1f477921-a357-4895-bad9-8489244afd27/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 
08:42:37.289681 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f29fae6-0696-4c14-8b68-94c800349ada" path="/var/lib/kubelet/pods/2f29fae6-0696-4c14-8b68-94c800349ada/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.290131 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e941ea-76c3-43d1-aa41-7897065fb55a" path="/var/lib/kubelet/pods/37e941ea-76c3-43d1-aa41-7897065fb55a/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.291233 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4926b604-c132-46b9-a156-46ae1662bc9d" path="/var/lib/kubelet/pods/4926b604-c132-46b9-a156-46ae1662bc9d/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.292225 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c3d0945-d5b0-43bc-9ecf-0c0023ba2566" path="/var/lib/kubelet/pods/5c3d0945-d5b0-43bc-9ecf-0c0023ba2566/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.292967 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d55f5bf-87aa-4993-a295-05b740129150" path="/var/lib/kubelet/pods/5d55f5bf-87aa-4993-a295-05b740129150/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.293540 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735f5180-6fe2-4632-b321-f6c96f3c9400" path="/var/lib/kubelet/pods/735f5180-6fe2-4632-b321-f6c96f3c9400/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.295582 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d9de29-7b2c-4544-8516-fe61912e4da9" path="/var/lib/kubelet/pods/75d9de29-7b2c-4544-8516-fe61912e4da9/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.296086 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a84386-eade-4e4b-a569-0dbce5dc6081" path="/var/lib/kubelet/pods/96a84386-eade-4e4b-a569-0dbce5dc6081/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 
08:42:37.296772 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a462bacd-997b-4e65-89d3-1db409e5b26b" path="/var/lib/kubelet/pods/a462bacd-997b-4e65-89d3-1db409e5b26b/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.299558 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adc11447-fe09-4d10-9d49-c064f5fffc7d" path="/var/lib/kubelet/pods/adc11447-fe09-4d10-9d49-c064f5fffc7d/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.300223 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e2805d-da62-4181-8d99-a5180a0c99e7" path="/var/lib/kubelet/pods/b5e2805d-da62-4181-8d99-a5180a0c99e7/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.300875 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61822bc-f709-47be-b2ed-71284622cbe1" path="/var/lib/kubelet/pods/b61822bc-f709-47be-b2ed-71284622cbe1/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.305431 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b" path="/var/lib/kubelet/pods/c8b352d5-fbc5-477c-bbe0-88eb5f3ed55b/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.317514 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb178ee5-a91d-4778-96f8-03cac37c55f5" path="/var/lib/kubelet/pods/cb178ee5-a91d-4778-96f8-03cac37c55f5/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.318413 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b6a26f-f25f-401d-a645-e94f9815314c" path="/var/lib/kubelet/pods/d4b6a26f-f25f-401d-a645-e94f9815314c/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.319065 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6625a18-265d-4c50-8841-f36e4f59d79f" path="/var/lib/kubelet/pods/d6625a18-265d-4c50-8841-f36e4f59d79f/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 
08:42:37.319634 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea12773e-89f6-478d-92f5-23bfb4a05a6a" path="/var/lib/kubelet/pods/ea12773e-89f6-478d-92f5-23bfb4a05a6a/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.321094 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec36c4e8-0d7b-4570-bb22-16367889063f" path="/var/lib/kubelet/pods/ec36c4e8-0d7b-4570-bb22-16367889063f/volumes" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.338886 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.341914 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.366360 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8e91b06-a3c1-41dc-b2f8-af738647ade8" (UID: "e8e91b06-a3c1-41dc-b2f8-af738647ade8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.373271 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.373319 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.373332 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.418624 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" (UID: "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.446465 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-config" (OuterVolumeSpecName: "config") pod "2d06311c-e246-4d3d-ba9c-388cb800ac4f" (UID: "2d06311c-e246-4d3d-ba9c-388cb800ac4f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.482864 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.483184 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.555753 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ad6d483-bca3-4391-9e4c-290b6b15b1f4" (UID: "6ad6d483-bca3-4391-9e4c-290b6b15b1f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.556896 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b135498-feb3-4024-b655-92f403f55bb9" (UID: "1b135498-feb3-4024-b655-92f403f55bb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.566486 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e8e91b06-a3c1-41dc-b2f8-af738647ade8" (UID: "e8e91b06-a3c1-41dc-b2f8-af738647ade8"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.586447 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.586469 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.586478 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.623419 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "6ad6d483-bca3-4391-9e4c-290b6b15b1f4" (UID: "6ad6d483-bca3-4391-9e4c-290b6b15b1f4"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.625980 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d06311c-e246-4d3d-ba9c-388cb800ac4f" (UID: "2d06311c-e246-4d3d-ba9c-388cb800ac4f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.626025 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d06311c-e246-4d3d-ba9c-388cb800ac4f" (UID: "2d06311c-e246-4d3d-ba9c-388cb800ac4f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.641767 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2d06311c-e246-4d3d-ba9c-388cb800ac4f" (UID: "2d06311c-e246-4d3d-ba9c-388cb800ac4f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.660394 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" (UID: "0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.673813 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e8e91b06-a3c1-41dc-b2f8-af738647ade8" (UID: "e8e91b06-a3c1-41dc-b2f8-af738647ade8"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.677161 4991 generic.go:334] "Generic (PLEG): container finished" podID="23e696d7-7767-4a92-9828-a189ffb52275" containerID="832edd5d33c524ced05fce73559b98b910c69bcaa4f037231d4db46add5712d9" exitCode=143 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.691693 4991 generic.go:334] "Generic (PLEG): container finished" podID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" containerID="d85323f86704585d0954acacab967e959246d8405dc02badbe6e793e45cbe71b" exitCode=0 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.694662 4991 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8e91b06-a3c1-41dc-b2f8-af738647ade8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.694709 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.694721 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.694736 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.694747 4991 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 
08:42:37.694756 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.696040 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "1b135498-feb3-4024-b655-92f403f55bb9" (UID: "1b135498-feb3-4024-b655-92f403f55bb9"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.699793 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6ad6d483-bca3-4391-9e4c-290b6b15b1f4" (UID: "6ad6d483-bca3-4391-9e4c-290b6b15b1f4"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.703900 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1b135498-feb3-4024-b655-92f403f55bb9" (UID: "1b135498-feb3-4024-b655-92f403f55bb9"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.712821 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d06311c-e246-4d3d-ba9c-388cb800ac4f" (UID: "2d06311c-e246-4d3d-ba9c-388cb800ac4f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.714622 4991 generic.go:334] "Generic (PLEG): container finished" podID="87faa73b-1148-48ae-88f4-3bdd06898658" containerID="c6b5f60779d336e5fbce098f6d9e1800e575b0fba43736ffc5139e17473ca9ec" exitCode=0 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.765691 4991 generic.go:334] "Generic (PLEG): container finished" podID="f4175b5d-7866-481a-a923-1ae5f3307195" containerID="c3b1614500005292c9e7b6920ac4a7cc87e019fd8e824585e552366b6101a5ab" exitCode=0 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.771273 4991 generic.go:334] "Generic (PLEG): container finished" podID="ab7f3760-250c-4e34-8bde-7e9218b711ff" containerID="4410132c0aa760f431a4973b217154eb03f1a1acbcd426c9c298f0ff9b2290ca" exitCode=0 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.773000 4991 generic.go:334] "Generic (PLEG): container finished" podID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" containerID="d649d548626a4bd3bff872429af0bef8f3a02f2808a38286a2013d34229a5407" exitCode=0 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.774876 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1b135498-feb3-4024-b655-92f403f55bb9/ovsdbserver-sb/0.log" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.780366 4991 generic.go:334] "Generic (PLEG): container finished" podID="4f1297ce-72cf-4b07-a66d-826e8e9c1663" containerID="8e9dca7e656636d8cc5ec73d30b74365cb11a8a92789fecf7545e4cc91f5646c" exitCode=0 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.804603 4991 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.806036 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1b135498-feb3-4024-b655-92f403f55bb9-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.806091 4991 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d06311c-e246-4d3d-ba9c-388cb800ac4f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.806103 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ad6d483-bca3-4391-9e4c-290b6b15b1f4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.820606 4991 generic.go:334] "Generic (PLEG): container finished" podID="801bcc07-7874-4eb8-8447-40178d80ea09" containerID="2b98780e70d84a8aec415e425c48e44718a23c872945a5f7884260c8ef099a6e" exitCode=0 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.820629 4991 generic.go:334] "Generic (PLEG): container finished" podID="801bcc07-7874-4eb8-8447-40178d80ea09" containerID="7145e5dc1f6f11d5b7c94e4ed5a3f94d613b31585eea12e4bfb621d2b89f737e" exitCode=0 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.822725 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.835355 4991 generic.go:334] "Generic (PLEG): container finished" podID="50622552-6b5c-4af5-a457-09c526c54f3f" containerID="cae646f42382972d5beded15a444697216db639c9bec708007612049ee7f8e6f" exitCode=0 Oct 06 08:42:37 crc kubenswrapper[4991]: I1006 08:42:37.841887 4991 generic.go:334] "Generic (PLEG): container finished" podID="305f56cb-d896-435c-ae06-4a407714b503" containerID="87310a359b34691e77d2310cf9bb176e0eebac4a7cd386913eb46acf7f2817cf" exitCode=0 Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.909701 4991 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 06 08:42:37 crc kubenswrapper[4991]: E1006 08:42:37.909760 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data podName:1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:41.909744849 +0000 UTC m=+1413.647494870 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data") pod "rabbitmq-cell1-server-0" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1") : configmap "rabbitmq-cell1-config-data" not found Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.014625 4991 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.014688 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config podName:0a6703e0-1fac-4734-98ac-88f6163fdaae nodeName:}" failed. No retries permitted until 2025-10-06 08:42:42.014674835 +0000 UTC m=+1413.752424856 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config") pod "neutron-7988dccf5c-j9ll7" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae") : secret "neutron-config" not found Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.014990 4991 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.015035 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config podName:0a6703e0-1fac-4734-98ac-88f6163fdaae nodeName:}" failed. No retries permitted until 2025-10-06 08:42:42.015028045 +0000 UTC m=+1413.752778066 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config") pod "neutron-7988dccf5c-j9ll7" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae") : secret "neutron-httpd-config" not found Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.604047 4991 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.354s" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604477 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23e696d7-7767-4a92-9828-a189ffb52275","Type":"ContainerDied","Data":"832edd5d33c524ced05fce73559b98b910c69bcaa4f037231d4db46add5712d9"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604509 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-79798cd5c5-jz6kb" event={"ID":"b2720ee8-eb06-4a0b-9bee-153b69ee769e","Type":"ContainerDied","Data":"d85323f86704585d0954acacab967e959246d8405dc02badbe6e793e45cbe71b"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604529 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placementd82d-account-delete-jlr78" event={"ID":"87faa73b-1148-48ae-88f4-3bdd06898658","Type":"ContainerDied","Data":"c6b5f60779d336e5fbce098f6d9e1800e575b0fba43736ffc5139e17473ca9ec"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604544 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1ea2a-account-delete-5smxw" event={"ID":"f2791937-a79f-4d99-b895-6d3ac79ba220","Type":"ContainerStarted","Data":"d9d372f6649b7ae2da7a793afd5a30828d575d6c4933357147853b3bad61a125"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604559 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f4175b5d-7866-481a-a923-1ae5f3307195","Type":"ContainerDied","Data":"c3b1614500005292c9e7b6920ac4a7cc87e019fd8e824585e552366b6101a5ab"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604572 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f4175b5d-7866-481a-a923-1ae5f3307195","Type":"ContainerDied","Data":"45896814893e4b4e27834ab230ff711bd7c04a9188e406652f3bd809ecd5fb5c"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604583 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45896814893e4b4e27834ab230ff711bd7c04a9188e406652f3bd809ecd5fb5c" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604597 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" event={"ID":"ab7f3760-250c-4e34-8bde-7e9218b711ff","Type":"ContainerDied","Data":"4410132c0aa760f431a4973b217154eb03f1a1acbcd426c9c298f0ff9b2290ca"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604611 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4dd2d34c-a29e-47b8-98b4-f75fffb11673","Type":"ContainerDied","Data":"d649d548626a4bd3bff872429af0bef8f3a02f2808a38286a2013d34229a5407"} Oct 06 08:42:38 
crc kubenswrapper[4991]: I1006 08:42:38.604626 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1b135498-feb3-4024-b655-92f403f55bb9","Type":"ContainerDied","Data":"222efa71f1bf216a77d131176cd2facbb90b30333355df78c9608e6b61ee430c"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604642 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb842-account-delete-b9lmt" event={"ID":"4f1297ce-72cf-4b07-a66d-826e8e9c1663","Type":"ContainerDied","Data":"8e9dca7e656636d8cc5ec73d30b74365cb11a8a92789fecf7545e4cc91f5646c"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604657 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-856f6664f9-gqcn7" event={"ID":"801bcc07-7874-4eb8-8447-40178d80ea09","Type":"ContainerDied","Data":"2b98780e70d84a8aec415e425c48e44718a23c872945a5f7884260c8ef099a6e"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604672 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-856f6664f9-gqcn7" event={"ID":"801bcc07-7874-4eb8-8447-40178d80ea09","Type":"ContainerDied","Data":"7145e5dc1f6f11d5b7c94e4ed5a3f94d613b31585eea12e4bfb621d2b89f737e"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604687 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi9279-account-delete-bsk7x" event={"ID":"2b01de4c-42f4-4928-916a-6a9638340718","Type":"ContainerStarted","Data":"8b9fb576189ea0b5bcf5af20288a8d25ce827ec05eb093f6aed9ec5455262ba0"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604703 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance15db-account-delete-rpvcb" event={"ID":"50622552-6b5c-4af5-a457-09c526c54f3f","Type":"ContainerDied","Data":"cae646f42382972d5beded15a444697216db639c9bec708007612049ee7f8e6f"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604718 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinderb589-account-delete-h8q45" event={"ID":"7d3b515a-b48d-48f7-8775-a0299e07f231","Type":"ContainerStarted","Data":"4c3a5e54b8632d36c596bec64e3dbf3296c1597a7040c4b8a896e908b5fae82a"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.604730 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicancd31-account-delete-l9h8d" event={"ID":"305f56cb-d896-435c-ae06-4a407714b503","Type":"ContainerDied","Data":"87310a359b34691e77d2310cf9bb176e0eebac4a7cd386913eb46acf7f2817cf"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.627487 4991 scope.go:117] "RemoveContainer" containerID="10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1" Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.627983 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1\": container with ID starting with 10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1 not found: ID does not exist" containerID="10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.628028 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1"} err="failed to get container status \"10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1\": rpc error: code = NotFound desc = could not find container \"10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1\": container with ID starting with 10fba4ffeb9258e648c480a04448a4793a3e7583273d8349b70e18d50dadb2b1 not found: ID does not exist" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.628052 4991 scope.go:117] "RemoveContainer" containerID="d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e" Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.628419 
4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e\": container with ID starting with d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e not found: ID does not exist" containerID="d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.628435 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e"} err="failed to get container status \"d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e\": rpc error: code = NotFound desc = could not find container \"d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e\": container with ID starting with d486a23aaf691458124bf3ce7261204f55edf0f5e85ebb0b639055228fe1101e not found: ID does not exist" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.628448 4991 scope.go:117] "RemoveContainer" containerID="ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.714191 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.726680 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.735914 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4t26\" (UniqueName: \"kubernetes.io/projected/ab7f3760-250c-4e34-8bde-7e9218b711ff-kube-api-access-m4t26\") pod \"ab7f3760-250c-4e34-8bde-7e9218b711ff\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.735960 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-combined-ca-bundle\") pod \"f4175b5d-7866-481a-a923-1ae5f3307195\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.742578 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab7f3760-250c-4e34-8bde-7e9218b711ff-logs\") pod \"ab7f3760-250c-4e34-8bde-7e9218b711ff\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.742628 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-nova-novncproxy-tls-certs\") pod \"f4175b5d-7866-481a-a923-1ae5f3307195\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.742678 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data\") pod \"ab7f3760-250c-4e34-8bde-7e9218b711ff\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.742730 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-lszfm\" (UniqueName: \"kubernetes.io/projected/f4175b5d-7866-481a-a923-1ae5f3307195-kube-api-access-lszfm\") pod \"f4175b5d-7866-481a-a923-1ae5f3307195\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.742809 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-config-data\") pod \"f4175b5d-7866-481a-a923-1ae5f3307195\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.742848 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-combined-ca-bundle\") pod \"ab7f3760-250c-4e34-8bde-7e9218b711ff\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.742916 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-vencrypt-tls-certs\") pod \"f4175b5d-7866-481a-a923-1ae5f3307195\" (UID: \"f4175b5d-7866-481a-a923-1ae5f3307195\") " Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.742974 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data-custom\") pod \"ab7f3760-250c-4e34-8bde-7e9218b711ff\" (UID: \"ab7f3760-250c-4e34-8bde-7e9218b711ff\") " Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.743663 4991 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.743708 4991 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data podName:53c6aca4-4fd0-4d42-bbe2-4b6e91643503 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:42.743695414 +0000 UTC m=+1414.481445435 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data") pod "rabbitmq-server-0" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503") : configmap "rabbitmq-config-data" not found Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.749074 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab7f3760-250c-4e34-8bde-7e9218b711ff-logs" (OuterVolumeSpecName: "logs") pod "ab7f3760-250c-4e34-8bde-7e9218b711ff" (UID: "ab7f3760-250c-4e34-8bde-7e9218b711ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.751590 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7f3760-250c-4e34-8bde-7e9218b711ff-kube-api-access-m4t26" (OuterVolumeSpecName: "kube-api-access-m4t26") pod "ab7f3760-250c-4e34-8bde-7e9218b711ff" (UID: "ab7f3760-250c-4e34-8bde-7e9218b711ff"). InnerVolumeSpecName "kube-api-access-m4t26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.759606 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4175b5d-7866-481a-a923-1ae5f3307195-kube-api-access-lszfm" (OuterVolumeSpecName: "kube-api-access-lszfm") pod "f4175b5d-7866-481a-a923-1ae5f3307195" (UID: "f4175b5d-7866-481a-a923-1ae5f3307195"). InnerVolumeSpecName "kube-api-access-lszfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.764892 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ab7f3760-250c-4e34-8bde-7e9218b711ff" (UID: "ab7f3760-250c-4e34-8bde-7e9218b711ff"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.780270 4991 scope.go:117] "RemoveContainer" containerID="ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb" Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.782056 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb\": container with ID starting with ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb not found: ID does not exist" containerID="ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.782083 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb"} err="failed to get container status \"ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb\": rpc error: code = NotFound desc = could not find container \"ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb\": container with ID starting with ff8dd4cda091d6263b296df7c8b159650a4000eb7ad5465af849369663c4aedb not found: ID does not exist" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.782103 4991 scope.go:117] "RemoveContainer" containerID="b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.817666 4991 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.817986 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="ceilometer-central-agent" containerID="cri-o://5649b7a21637b90f14805bd6598872f1dc5add87d3dbf7ebd86b1d9b50e3aaf8" gracePeriod=30 Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.820467 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="proxy-httpd" containerID="cri-o://f562f33940beff3b6675303b795947ca3b1c6407fc9bbc1512ece13bd67badb0" gracePeriod=30 Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.820587 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="sg-core" containerID="cri-o://0d50a1a51fb76b15129da6a601cd7086c571cfcb88c3e73e801b6de71d603ab5" gracePeriod=30 Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.820569 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="ceilometer-notification-agent" containerID="cri-o://73a68671b100b61405ad43b9f475805bafe221206d22cbb30d84e206645d9fff" gracePeriod=30 Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.877665 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.878047 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="fded3e15-f946-4f86-bed4-2c4a3262395a" containerName="kube-state-metrics" containerID="cri-o://2b0d0844a91b8badda9de344d7ec23d9d43da43f008821a4ea8b7ae982ffc991" gracePeriod=30 Oct 06 
08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.878492 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4t26\" (UniqueName: \"kubernetes.io/projected/ab7f3760-250c-4e34-8bde-7e9218b711ff-kube-api-access-m4t26\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.879447 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab7f3760-250c-4e34-8bde-7e9218b711ff-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.879495 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lszfm\" (UniqueName: \"kubernetes.io/projected/f4175b5d-7866-481a-a923-1ae5f3307195-kube-api-access-lszfm\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.879515 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.951639 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance15db-account-delete-rpvcb" event={"ID":"50622552-6b5c-4af5-a457-09c526c54f3f","Type":"ContainerDied","Data":"4b1c4cb63acfc187b27cb2a3dfa633dfdc736b06e498d21d81a3c0e9609fb340"} Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.951673 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b1c4cb63acfc187b27cb2a3dfa633dfdc736b06e498d21d81a3c0e9609fb340" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.957697 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.970362 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 06 08:42:38 crc kubenswrapper[4991]: I1006 08:42:38.970560 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="033164fc-5a6f-4b9d-8c3a-1e4242078c9e" containerName="memcached" containerID="cri-o://beb72f2fe0b1d7a0ddb9249baead2d79de7b72973b4fde75ed6d9bc96c982e97" gracePeriod=30 Oct 06 08:42:38 crc kubenswrapper[4991]: E1006 08:42:38.983552 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693 is running failed: container process not found" containerID="05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 08:42:39 crc kubenswrapper[4991]: E1006 08:42:38.995778 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693 is running failed: container process not found" containerID="05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 08:42:39 crc kubenswrapper[4991]: E1006 08:42:38.999880 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693 is running failed: container process not found" containerID="05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 08:42:39 crc kubenswrapper[4991]: E1006 
08:42:38.999916 4991 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="c5b53689-326b-4f4c-a625-beec7a3631fa" containerName="nova-cell1-conductor-conductor" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.028186 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.044186 4991 scope.go:117] "RemoveContainer" containerID="36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.044405 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.044876 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-6qdfr"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.044903 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75c547987d-brwwk" event={"ID":"ab7f3760-250c-4e34-8bde-7e9218b711ff","Type":"ContainerDied","Data":"a9a6952d5ef0cdd5ded874846bc65015cacc3696b9a2c51c948ab0619ed3b799"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.058380 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vwxnk"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.058654 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.076545 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementd82d-account-delete-jlr78" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.083786 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-79798cd5c5-jz6kb" event={"ID":"b2720ee8-eb06-4a0b-9bee-153b69ee769e","Type":"ContainerDied","Data":"c722d321ae9a78f88e82287a1583ca6ae7044e2e4bb3a86e94b4936e702b1a57"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.083884 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-79798cd5c5-jz6kb" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.085692 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-combined-ca-bundle\") pod \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.085733 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data\") pod \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.085867 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwtt8\" (UniqueName: \"kubernetes.io/projected/b2720ee8-eb06-4a0b-9bee-153b69ee769e-kube-api-access-cwtt8\") pod \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.085951 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2720ee8-eb06-4a0b-9bee-153b69ee769e-logs\") pod \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " Oct 06 
08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.085979 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data-custom\") pod \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\" (UID: \"b2720ee8-eb06-4a0b-9bee-153b69ee769e\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.090192 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2720ee8-eb06-4a0b-9bee-153b69ee769e-logs" (OuterVolumeSpecName: "logs") pod "b2720ee8-eb06-4a0b-9bee-153b69ee769e" (UID: "b2720ee8-eb06-4a0b-9bee-153b69ee769e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.094396 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2720ee8-eb06-4a0b-9bee-153b69ee769e-kube-api-access-cwtt8" (OuterVolumeSpecName: "kube-api-access-cwtt8") pod "b2720ee8-eb06-4a0b-9bee-153b69ee769e" (UID: "b2720ee8-eb06-4a0b-9bee-153b69ee769e"). InnerVolumeSpecName "kube-api-access-cwtt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.127066 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.165:8776/healthcheck\": read tcp 10.217.0.2:33744->10.217.0.165:8776: read: connection reset by peer" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.139847 4991 generic.go:334] "Generic (PLEG): container finished" podID="aa57b1fb-c743-4137-9501-a0110f385b1c" containerID="9f7dc8083673fb521af061c8df5ca04354332444376a940728267b2a54832c2d" exitCode=0 Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.139896 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa57b1fb-c743-4137-9501-a0110f385b1c","Type":"ContainerDied","Data":"9f7dc8083673fb521af061c8df5ca04354332444376a940728267b2a54832c2d"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.143428 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.145520 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementd82d-account-delete-jlr78" event={"ID":"87faa73b-1148-48ae-88f4-3bdd06898658","Type":"ContainerDied","Data":"1f4913800a4fbe96c73ba606ca910bdb6f240e725720313f501e653b814e1009"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.145627 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementd82d-account-delete-jlr78" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.160633 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7pz2m"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.163389 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vwxnk"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.165051 4991 generic.go:334] "Generic (PLEG): container finished" podID="c5b53689-326b-4f4c-a625-beec7a3631fa" containerID="05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693" exitCode=0 Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.165124 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c5b53689-326b-4f4c-a625-beec7a3631fa","Type":"ContainerDied","Data":"05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.168776 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b2720ee8-eb06-4a0b-9bee-153b69ee769e" (UID: "b2720ee8-eb06-4a0b-9bee-153b69ee769e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: E1006 08:42:39.169192 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ad30dfa_4735_4ef3_8fcc_4b6f25eefcd6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ad30dfa_4735_4ef3_8fcc_4b6f25eefcd6.slice/crio-557b7992f30f35d6f7674d6a98d19517cd682b267963b24a83c30d22b62d0339\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ad6d483_bca3_4391_9e4c_290b6b15b1f4.slice/crio-4ca1911997e9bbd3466058aaa12c187f5778061ad0ecd52fb889992119a4044f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9160ed8e_9be5_4d38_b9a0_7138dfecc506.slice/crio-0d50a1a51fb76b15129da6a601cd7086c571cfcb88c3e73e801b6de71d603ab5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d06311c_e246_4d3d_ba9c_388cb800ac4f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ad6d483_bca3_4391_9e4c_290b6b15b1f4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8e91b06_a3c1_41dc_b2f8_af738647ade8.slice/crio-e0f3942eca2775e8d394768a9a7631a254217c5c85c027947f1be02fccfec5a9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa57b1fb_c743_4137_9501_a0110f385b1c.slice/crio-conmon-9f7dc8083673fb521af061c8df5ca04354332444376a940728267b2a54832c2d.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa57b1fb_c743_4137_9501_a0110f385b1c.slice/crio-9f7dc8083673fb521af061c8df5ca04354332444376a940728267b2a54832c2d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b135498_feb3_4024_b655_92f403f55bb9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b135498_feb3_4024_b655_92f403f55bb9.slice/crio-222efa71f1bf216a77d131176cd2facbb90b30333355df78c9608e6b61ee430c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8e91b06_a3c1_41dc_b2f8_af738647ade8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d06311c_e246_4d3d_ba9c_388cb800ac4f.slice/crio-8837db3f318960e4eaf4d8c389f5127148a69eb983ba046be127e70ea5f58f7d\": RecentStats: unable to find data in memory cache]" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.170642 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.176638 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7pz2m"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.181842 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb842-account-delete-b9lmt" event={"ID":"4f1297ce-72cf-4b07-a66d-826e8e9c1663","Type":"ContainerDied","Data":"fb50fe6f781ad98b7568e7845a3eb4387b552f0fb9d542189b5c26f5d7d30f84"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.181950 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb50fe6f781ad98b7568e7845a3eb4387b552f0fb9d542189b5c26f5d7d30f84" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.192136 4991 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/keystone-774597bb4-6c42q"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.192439 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-774597bb4-6c42q" podUID="79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" containerName="keystone-api" containerID="cri-o://0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0" gracePeriod=30 Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.193133 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-internal-tls-certs\") pod \"801bcc07-7874-4eb8-8447-40178d80ea09\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.193221 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-run-httpd\") pod \"801bcc07-7874-4eb8-8447-40178d80ea09\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.193259 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-combined-ca-bundle\") pod \"801bcc07-7874-4eb8-8447-40178d80ea09\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.193280 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvd6v\" (UniqueName: \"kubernetes.io/projected/87faa73b-1148-48ae-88f4-3bdd06898658-kube-api-access-jvd6v\") pod \"87faa73b-1148-48ae-88f4-3bdd06898658\" (UID: \"87faa73b-1148-48ae-88f4-3bdd06898658\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.193334 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-log-httpd\") pod \"801bcc07-7874-4eb8-8447-40178d80ea09\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.193384 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-etc-swift\") pod \"801bcc07-7874-4eb8-8447-40178d80ea09\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.193424 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-config-data\") pod \"801bcc07-7874-4eb8-8447-40178d80ea09\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.193499 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-public-tls-certs\") pod \"801bcc07-7874-4eb8-8447-40178d80ea09\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.193515 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dkpc\" (UniqueName: \"kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-kube-api-access-2dkpc\") pod \"801bcc07-7874-4eb8-8447-40178d80ea09\" (UID: \"801bcc07-7874-4eb8-8447-40178d80ea09\") " Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.194796 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "801bcc07-7874-4eb8-8447-40178d80ea09" (UID: "801bcc07-7874-4eb8-8447-40178d80ea09"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.195303 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwtt8\" (UniqueName: \"kubernetes.io/projected/b2720ee8-eb06-4a0b-9bee-153b69ee769e-kube-api-access-cwtt8\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.195318 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2720ee8-eb06-4a0b-9bee-153b69ee769e-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.195327 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.195335 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.196673 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "801bcc07-7874-4eb8-8447-40178d80ea09" (UID: "801bcc07-7874-4eb8-8447-40178d80ea09"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.199665 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.203069 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicancd31-account-delete-l9h8d" event={"ID":"305f56cb-d896-435c-ae06-4a407714b503","Type":"ContainerDied","Data":"e07b6d6d9bc08140ce4a9c1fb9f8b41eeb82670ae60b440d3e025e27bc9e9909"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.203115 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e07b6d6d9bc08140ce4a9c1fb9f8b41eeb82670ae60b440d3e025e27bc9e9909" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.209832 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-kube-api-access-2dkpc" (OuterVolumeSpecName: "kube-api-access-2dkpc") pod "801bcc07-7874-4eb8-8447-40178d80ea09" (UID: "801bcc07-7874-4eb8-8447-40178d80ea09"). InnerVolumeSpecName "kube-api-access-2dkpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.210456 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.211053 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87faa73b-1148-48ae-88f4-3bdd06898658-kube-api-access-jvd6v" (OuterVolumeSpecName: "kube-api-access-jvd6v") pod "87faa73b-1148-48ae-88f4-3bdd06898658" (UID: "87faa73b-1148-48ae-88f4-3bdd06898658"). InnerVolumeSpecName "kube-api-access-jvd6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.211270 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-856f6664f9-gqcn7" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.211789 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-856f6664f9-gqcn7" event={"ID":"801bcc07-7874-4eb8-8447-40178d80ea09","Type":"ContainerDied","Data":"9dcfd2196f73e6739092d1903de65b86b03c7b97db9d23f6b170f3f95c13a843"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.221959 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.228931 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-df86g"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.229212 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "801bcc07-7874-4eb8-8447-40178d80ea09" (UID: "801bcc07-7874-4eb8-8447-40178d80ea09"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.229578 4991 generic.go:334] "Generic (PLEG): container finished" podID="d1a24973-6ef6-4732-9a96-040ce646a707" containerID="4fbfc2abb485c8ccd9560493a1360ee31985544c8877ac7b1baa4f76139308c7" exitCode=0 Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.229632 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d1a24973-6ef6-4732-9a96-040ce646a707","Type":"ContainerDied","Data":"4fbfc2abb485c8ccd9560493a1360ee31985544c8877ac7b1baa4f76139308c7"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.233422 4991 generic.go:334] "Generic (PLEG): container finished" podID="157f3f65-3397-4a2d-98ea-1ae5897c7a76" containerID="94c589983290634c76235daa1990cab452138af9c99951302ddc413d46fc20a4" exitCode=0 Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.233510 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"157f3f65-3397-4a2d-98ea-1ae5897c7a76","Type":"ContainerDied","Data":"94c589983290634c76235daa1990cab452138af9c99951302ddc413d46fc20a4"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.233536 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"157f3f65-3397-4a2d-98ea-1ae5897c7a76","Type":"ContainerDied","Data":"19bfa8422af3ebc6a59245353778ba8be20c0c90fd0573ce41a887263f3cbe62"} Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.233549 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19bfa8422af3ebc6a59245353778ba8be20c0c90fd0573ce41a887263f3cbe62" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.235742 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-df86g"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.242050 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-db-create-4qt96"] Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.251334 4991 generic.go:334] "Generic (PLEG): container finished" podID="feb6a9a7-403e-4dc9-903c-349391d84efb" containerID="fb10a7813dbd9816182db52dd9a3b501c4c766c110a2193adb6f6007214cdc4f" exitCode=0 Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.251419 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.270585 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" path="/var/lib/kubelet/pods/0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6/volumes" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.274626 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b135498-feb3-4024-b655-92f403f55bb9" path="/var/lib/kubelet/pods/1b135498-feb3-4024-b655-92f403f55bb9/volumes" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.275536 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d06311c-e246-4d3d-ba9c-388cb800ac4f" path="/var/lib/kubelet/pods/2d06311c-e246-4d3d-ba9c-388cb800ac4f/volumes" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.276223 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f3aef9d-9026-440f-a163-c1caaefb69a3" path="/var/lib/kubelet/pods/2f3aef9d-9026-440f-a163-c1caaefb69a3/volumes" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.277377 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" path="/var/lib/kubelet/pods/6ad6d483-bca3-4391-9e4c-290b6b15b1f4/volumes" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.278265 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0dc26d4-eaf8-4419-a5e9-82e40496890b" path="/var/lib/kubelet/pods/c0dc26d4-eaf8-4419-a5e9-82e40496890b/volumes" Oct 06 
08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.278787 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8e91b06-a3c1-41dc-b2f8-af738647ade8" path="/var/lib/kubelet/pods/e8e91b06-a3c1-41dc-b2f8-af738647ade8/volumes" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.298691 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvd6v\" (UniqueName: \"kubernetes.io/projected/87faa73b-1148-48ae-88f4-3bdd06898658-kube-api-access-jvd6v\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.298724 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801bcc07-7874-4eb8-8447-40178d80ea09-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.298734 4991 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.298743 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dkpc\" (UniqueName: \"kubernetes.io/projected/801bcc07-7874-4eb8-8447-40178d80ea09-kube-api-access-2dkpc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.379930 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7988dccf5c-j9ll7" podUID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.538484 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": read tcp 
10.217.0.2:50172->10.217.0.201:8775: read: connection reset by peer" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.538502 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": read tcp 10.217.0.2:50182->10.217.0.201:8775: read: connection reset by peer" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.592891 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2720ee8-eb06-4a0b-9bee-153b69ee769e" (UID: "b2720ee8-eb06-4a0b-9bee-153b69ee769e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.608130 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.660735 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4175b5d-7866-481a-a923-1ae5f3307195" (UID: "f4175b5d-7866-481a-a923-1ae5f3307195"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.689118 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "801bcc07-7874-4eb8-8447-40178d80ea09" (UID: "801bcc07-7874-4eb8-8447-40178d80ea09"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.710205 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.710231 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.710747 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-548cc795f4-8m4d9" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.154:9311/healthcheck\": dial tcp 10.217.0.154:9311: connect: connection refused" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.710955 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-548cc795f4-8m4d9" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.154:9311/healthcheck\": dial tcp 10.217.0.154:9311: connect: connection refused" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.711735 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data" (OuterVolumeSpecName: "config-data") pod "b2720ee8-eb06-4a0b-9bee-153b69ee769e" (UID: "b2720ee8-eb06-4a0b-9bee-153b69ee769e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.735417 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab7f3760-250c-4e34-8bde-7e9218b711ff" (UID: "ab7f3760-250c-4e34-8bde-7e9218b711ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.765466 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data" (OuterVolumeSpecName: "config-data") pod "ab7f3760-250c-4e34-8bde-7e9218b711ff" (UID: "ab7f3760-250c-4e34-8bde-7e9218b711ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.768002 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-config-data" (OuterVolumeSpecName: "config-data") pod "f4175b5d-7866-481a-a923-1ae5f3307195" (UID: "f4175b5d-7866-481a-a923-1ae5f3307195"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.780557 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-config-data" (OuterVolumeSpecName: "config-data") pod "801bcc07-7874-4eb8-8447-40178d80ea09" (UID: "801bcc07-7874-4eb8-8447-40178d80ea09"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.785518 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "f4175b5d-7866-481a-a923-1ae5f3307195" (UID: "f4175b5d-7866-481a-a923-1ae5f3307195"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.790440 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "f4175b5d-7866-481a-a923-1ae5f3307195" (UID: "f4175b5d-7866-481a-a923-1ae5f3307195"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.811937 4991 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.811973 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.812004 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.812012 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.812021 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7f3760-250c-4e34-8bde-7e9218b711ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.812029 4991 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4175b5d-7866-481a-a923-1ae5f3307195-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.812038 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2720ee8-eb06-4a0b-9bee-153b69ee769e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.826242 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="d1f986ad-8a8d-44d3-b200-479a60f8b8b3" containerName="galera" containerID="cri-o://9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc" gracePeriod=30 Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.828686 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "801bcc07-7874-4eb8-8447-40178d80ea09" (UID: "801bcc07-7874-4eb8-8447-40178d80ea09"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.835881 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "801bcc07-7874-4eb8-8447-40178d80ea09" (UID: "801bcc07-7874-4eb8-8447-40178d80ea09"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.916645 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:39 crc kubenswrapper[4991]: I1006 08:42:39.917061 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/801bcc07-7874-4eb8-8447-40178d80ea09-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.264496 4991 generic.go:334] "Generic (PLEG): container finished" podID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" containerID="4e08aae5f1f3064fd06a75855d7641f5f9a9574da5cd200704d0371193acd2b3" exitCode=0 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.266160 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi9279-account-delete-bsk7x" podUID="2b01de4c-42f4-4928-916a-6a9638340718" containerName="mariadb-account-delete" containerID="cri-o://5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085" gracePeriod=30 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.272061 4991 generic.go:334] "Generic (PLEG): container finished" podID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerID="6a0dd291d385b5c827db71bd9cfc93863a8619475a3e50f8d7d1d405394842f9" exitCode=0 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.279047 4991 
generic.go:334] "Generic (PLEG): container finished" podID="23e696d7-7767-4a92-9828-a189ffb52275" containerID="98d4fcdfdc9774dff4624bf92e206f1e36780461435c0e70b7a79655aa1bd813" exitCode=0 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.281505 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi9279-account-delete-bsk7x" podStartSLOduration=6.281494497 podStartE2EDuration="6.281494497s" podCreationTimestamp="2025-10-06 08:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:40.279372325 +0000 UTC m=+1412.017122356" watchObservedRunningTime="2025-10-06 08:42:40.281494497 +0000 UTC m=+1412.019244518" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.289013 4991 generic.go:334] "Generic (PLEG): container finished" podID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerID="f562f33940beff3b6675303b795947ca3b1c6407fc9bbc1512ece13bd67badb0" exitCode=0 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.289049 4991 generic.go:334] "Generic (PLEG): container finished" podID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerID="0d50a1a51fb76b15129da6a601cd7086c571cfcb88c3e73e801b6de71d603ab5" exitCode=2 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.289059 4991 generic.go:334] "Generic (PLEG): container finished" podID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerID="5649b7a21637b90f14805bd6598872f1dc5add87d3dbf7ebd86b1d9b50e3aaf8" exitCode=0 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.301948 4991 generic.go:334] "Generic (PLEG): container finished" podID="033164fc-5a6f-4b9d-8c3a-1e4242078c9e" containerID="beb72f2fe0b1d7a0ddb9249baead2d79de7b72973b4fde75ed6d9bc96c982e97" exitCode=0 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.303913 4991 generic.go:334] "Generic (PLEG): container finished" podID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" 
containerID="0c70b35b1a4450b4db02a166e4cb0db2437a7fcc554b453e7d86b3f8efc7685d" exitCode=0 Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.305584 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.306530 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.306735 4991 generic.go:334] "Generic (PLEG): container finished" podID="f2791937-a79f-4d99-b895-6d3ac79ba220" containerID="cba219b856cd8afbdba07fd20346d5c2d98eb76e2dd1e02ef4b5e55c609acd84" exitCode=1 Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.307721 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.307743 4991 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerName="ovn-northd" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.318569 4991 generic.go:334] "Generic (PLEG): 
container finished" podID="fded3e15-f946-4f86-bed4-2c4a3262395a" containerID="2b0d0844a91b8badda9de344d7ec23d9d43da43f008821a4ea8b7ae982ffc991" exitCode=2 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.320261 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinderb589-account-delete-h8q45" podUID="7d3b515a-b48d-48f7-8775-a0299e07f231" containerName="mariadb-account-delete" containerID="cri-o://156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b" gracePeriod=30 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.331130 4991 generic.go:334] "Generic (PLEG): container finished" podID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" containerID="a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f" exitCode=0 Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.349971 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinderb589-account-delete-h8q45" podStartSLOduration=6.349949593 podStartE2EDuration="6.349949593s" podCreationTimestamp="2025-10-06 08:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:42:40.342839777 +0000 UTC m=+1412.080589798" watchObservedRunningTime="2025-10-06 08:42:40.349949593 +0000 UTC m=+1412.087699614" Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.469029 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98 is running failed: container process not found" containerID="758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.469471 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of 758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98 is running failed: container process not found" containerID="758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.469830 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98 is running failed: container process not found" containerID="758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.469877 4991 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="697548ef-9b89-4827-a5f1-4e535ae94722" containerName="nova-cell0-conductor-conductor" Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.527572 4991 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.285s" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529654 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b98fcbb5b-2m256" event={"ID":"feb6a9a7-403e-4dc9-903c-349391d84efb","Type":"ContainerDied","Data":"fb10a7813dbd9816182db52dd9a3b501c4c766c110a2193adb6f6007214cdc4f"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529697 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4qt96"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529729 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-1358-account-create-tthq2"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529743 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b98fcbb5b-2m256" event={"ID":"feb6a9a7-403e-4dc9-903c-349391d84efb","Type":"ContainerDied","Data":"f68fcaa96a538101b5a2515f8d7dbd7ca1052eab89f2d5473e684b8fded6fc0d"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529756 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f68fcaa96a538101b5a2515f8d7dbd7ca1052eab89f2d5473e684b8fded6fc0d" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529770 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1358-account-create-tthq2"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529791 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"815c282e-cc40-4ff8-b3f8-155d9a91a20b","Type":"ContainerDied","Data":"4e08aae5f1f3064fd06a75855d7641f5f9a9574da5cd200704d0371193acd2b3"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529808 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"815c282e-cc40-4ff8-b3f8-155d9a91a20b","Type":"ContainerDied","Data":"6007afbe8efe47aa81a20c944be444478bb0924cc26fd218c84bb8b618e51bb4"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529820 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6007afbe8efe47aa81a20c944be444478bb0924cc26fd218c84bb8b618e51bb4" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529830 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi9279-account-delete-bsk7x" event={"ID":"2b01de4c-42f4-4928-916a-6a9638340718","Type":"ContainerStarted","Data":"5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529842 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"70e2b1c5-03aa-4472-9002-7daf936edc67","Type":"ContainerDied","Data":"6a0dd291d385b5c827db71bd9cfc93863a8619475a3e50f8d7d1d405394842f9"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529863 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70e2b1c5-03aa-4472-9002-7daf936edc67","Type":"ContainerDied","Data":"a48db5685f7377e9be4ea3f517179ff3b19dee2b7baad29e35463d0a8cdeb6c6"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529886 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a48db5685f7377e9be4ea3f517179ff3b19dee2b7baad29e35463d0a8cdeb6c6" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529906 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23e696d7-7767-4a92-9828-a189ffb52275","Type":"ContainerDied","Data":"98d4fcdfdc9774dff4624bf92e206f1e36780461435c0e70b7a79655aa1bd813"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529923 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"23e696d7-7767-4a92-9828-a189ffb52275","Type":"ContainerDied","Data":"858b1c9988c6c99d9be9614393b9d3088930a19a6747938253299bfda4441743"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529934 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="858b1c9988c6c99d9be9614393b9d3088930a19a6747938253299bfda4441743" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529944 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9160ed8e-9be5-4d38-b9a0-7138dfecc506","Type":"ContainerDied","Data":"f562f33940beff3b6675303b795947ca3b1c6407fc9bbc1512ece13bd67badb0"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529958 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9160ed8e-9be5-4d38-b9a0-7138dfecc506","Type":"ContainerDied","Data":"0d50a1a51fb76b15129da6a601cd7086c571cfcb88c3e73e801b6de71d603ab5"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529969 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9160ed8e-9be5-4d38-b9a0-7138dfecc506","Type":"ContainerDied","Data":"5649b7a21637b90f14805bd6598872f1dc5add87d3dbf7ebd86b1d9b50e3aaf8"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529981 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c5b53689-326b-4f4c-a625-beec7a3631fa","Type":"ContainerDied","Data":"be2c98bbc96939e542bf23df0e5cf44de8fa28b270045204096113d0a382a945"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.529993 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be2c98bbc96939e542bf23df0e5cf44de8fa28b270045204096113d0a382a945" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530002 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d1a24973-6ef6-4732-9a96-040ce646a707","Type":"ContainerDied","Data":"7df79192d6f34a90ebfd5f7c39a7e4dc001ad60b7b016b0f55c1a72ada7c789b"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530014 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7df79192d6f34a90ebfd5f7c39a7e4dc001ad60b7b016b0f55c1a72ada7c789b" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530024 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"033164fc-5a6f-4b9d-8c3a-1e4242078c9e","Type":"ContainerDied","Data":"beb72f2fe0b1d7a0ddb9249baead2d79de7b72973b4fde75ed6d9bc96c982e97"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530036 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548cc795f4-8m4d9" 
event={"ID":"a9be32ba-d183-4fd5-ba8b-63f79c973c81","Type":"ContainerDied","Data":"0c70b35b1a4450b4db02a166e4cb0db2437a7fcc554b453e7d86b3f8efc7685d"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530050 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-548cc795f4-8m4d9" event={"ID":"a9be32ba-d183-4fd5-ba8b-63f79c973c81","Type":"ContainerDied","Data":"773d7d55a52709c57ac795cd9a5c5b4cc33c328c660fb36301f52c84e195de8a"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530060 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="773d7d55a52709c57ac795cd9a5c5b4cc33c328c660fb36301f52c84e195de8a" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530070 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1ea2a-account-delete-5smxw" event={"ID":"f2791937-a79f-4d99-b895-6d3ac79ba220","Type":"ContainerDied","Data":"cba219b856cd8afbdba07fd20346d5c2d98eb76e2dd1e02ef4b5e55c609acd84"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530082 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fded3e15-f946-4f86-bed4-2c4a3262395a","Type":"ContainerDied","Data":"2b0d0844a91b8badda9de344d7ec23d9d43da43f008821a4ea8b7ae982ffc991"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530095 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fded3e15-f946-4f86-bed4-2c4a3262395a","Type":"ContainerDied","Data":"dedb648723c43b1425e3d19a451a68998cd390e07ca52163680413cc97987dad"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530105 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dedb648723c43b1425e3d19a451a68998cd390e07ca52163680413cc97987dad" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530114 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderb589-account-delete-h8q45" 
event={"ID":"7d3b515a-b48d-48f7-8775-a0299e07f231","Type":"ContainerStarted","Data":"156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530126 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bc2q" event={"ID":"544d772e-ee45-4bd6-9895-07dec1dc3ff1","Type":"ContainerDied","Data":"a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530140 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"aa57b1fb-c743-4137-9501-a0110f385b1c","Type":"ContainerDied","Data":"9230720e81af75feaabd54c331218446399e1d16122ab695c8e393b09db8037b"} Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.530152 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9230720e81af75feaabd54c331218446399e1d16122ab695c8e393b09db8037b" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.552087 4991 scope.go:117] "RemoveContainer" containerID="b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9" Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.553455 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9\": container with ID starting with b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9 not found: ID does not exist" containerID="b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.555436 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance15db-account-delete-rpvcb" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.556273 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.553546 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9"} err="failed to get container status \"b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9\": rpc error: code = NotFound desc = could not find container \"b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9\": container with ID starting with b79737a793ce65fe47127b1cd80cbe9ab8fca0f3e977e71eec7b3ac7b3b5f4b9 not found: ID does not exist" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.556442 4991 scope.go:117] "RemoveContainer" containerID="36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b" Oct 06 08:42:40 crc kubenswrapper[4991]: E1006 08:42:40.556992 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b\": container with ID starting with 36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b not found: ID does not exist" containerID="36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.557056 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b"} err="failed to get container status \"36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b\": rpc error: code = NotFound desc = could not find container \"36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b\": container with ID starting with 36af0fd1b0ff2c2881c5162925289809f0f97c9bff4bd5962aa2e915f1cc914b not found: ID does not exist" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.557138 4991 scope.go:117] "RemoveContainer" 
containerID="81c13c33c57ac2a7fafdd527f3ce9a1ddf23d76bafb0a388ddb5af43c282a8b4" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.582492 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicancd31-account-delete-l9h8d" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.603520 4991 scope.go:117] "RemoveContainer" containerID="4410132c0aa760f431a4973b217154eb03f1a1acbcd426c9c298f0ff9b2290ca" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.656482 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4wnb\" (UniqueName: \"kubernetes.io/projected/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kube-api-access-h4wnb\") pod \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.656713 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-combined-ca-bundle\") pod \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.657735 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-operator-scripts\") pod \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.657760 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-secrets\") pod \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.658687 4991 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "157f3f65-3397-4a2d-98ea-1ae5897c7a76" (UID: "157f3f65-3397-4a2d-98ea-1ae5897c7a76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.659350 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.663612 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kube-api-access-h4wnb" (OuterVolumeSpecName: "kube-api-access-h4wnb") pod "157f3f65-3397-4a2d-98ea-1ae5897c7a76" (UID: "157f3f65-3397-4a2d-98ea-1ae5897c7a76"). InnerVolumeSpecName "kube-api-access-h4wnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.680871 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-856f6664f9-gqcn7"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.708029 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-secrets" (OuterVolumeSpecName: "secrets") pod "157f3f65-3397-4a2d-98ea-1ae5897c7a76" (UID: "157f3f65-3397-4a2d-98ea-1ae5897c7a76"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.719549 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronb842-account-delete-b9lmt" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.720172 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.721189 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-856f6664f9-gqcn7"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.736727 4991 scope.go:117] "RemoveContainer" containerID="118c1de5d5587e621349f796c4342c648e5951d232a1d0e3dcb3fd1f0b4f7705" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.737549 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-75c547987d-brwwk"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.738202 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.748679 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-75c547987d-brwwk"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760311 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-httpd-run\") pod \"d1a24973-6ef6-4732-9a96-040ce646a707\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760378 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jd6h\" (UniqueName: \"kubernetes.io/projected/50622552-6b5c-4af5-a457-09c526c54f3f-kube-api-access-5jd6h\") pod \"50622552-6b5c-4af5-a457-09c526c54f3f\" (UID: \"50622552-6b5c-4af5-a457-09c526c54f3f\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760405 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kolla-config\") pod \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\" (UID: 
\"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760443 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760470 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-scripts\") pod \"d1a24973-6ef6-4732-9a96-040ce646a707\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760495 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg7vn\" (UniqueName: \"kubernetes.io/projected/4f1297ce-72cf-4b07-a66d-826e8e9c1663-kube-api-access-zg7vn\") pod \"4f1297ce-72cf-4b07-a66d-826e8e9c1663\" (UID: \"4f1297ce-72cf-4b07-a66d-826e8e9c1663\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760521 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-galera-tls-certs\") pod \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760547 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-config-data\") pod \"feb6a9a7-403e-4dc9-903c-349391d84efb\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760572 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/feb6a9a7-403e-4dc9-903c-349391d84efb-logs\") pod \"feb6a9a7-403e-4dc9-903c-349391d84efb\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760620 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-public-tls-certs\") pod \"feb6a9a7-403e-4dc9-903c-349391d84efb\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760648 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6bxl\" (UniqueName: \"kubernetes.io/projected/305f56cb-d896-435c-ae06-4a407714b503-kube-api-access-w6bxl\") pod \"305f56cb-d896-435c-ae06-4a407714b503\" (UID: \"305f56cb-d896-435c-ae06-4a407714b503\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760678 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-combined-ca-bundle\") pod \"d1a24973-6ef6-4732-9a96-040ce646a707\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760708 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-default\") pod \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760737 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-config-data\") pod \"d1a24973-6ef6-4732-9a96-040ce646a707\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " Oct 06 08:42:40 crc kubenswrapper[4991]: 
I1006 08:42:40.760761 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-public-tls-certs\") pod \"d1a24973-6ef6-4732-9a96-040ce646a707\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760800 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-combined-ca-bundle\") pod \"feb6a9a7-403e-4dc9-903c-349391d84efb\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760828 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-logs\") pod \"d1a24973-6ef6-4732-9a96-040ce646a707\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760861 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjp9s\" (UniqueName: \"kubernetes.io/projected/d1a24973-6ef6-4732-9a96-040ce646a707-kube-api-access-tjp9s\") pod \"d1a24973-6ef6-4732-9a96-040ce646a707\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760885 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-scripts\") pod \"feb6a9a7-403e-4dc9-903c-349391d84efb\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760908 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-internal-tls-certs\") pod 
\"feb6a9a7-403e-4dc9-903c-349391d84efb\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760928 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d1a24973-6ef6-4732-9a96-040ce646a707\" (UID: \"d1a24973-6ef6-4732-9a96-040ce646a707\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.760967 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdf7w\" (UniqueName: \"kubernetes.io/projected/feb6a9a7-403e-4dc9-903c-349391d84efb-kube-api-access-xdf7w\") pod \"feb6a9a7-403e-4dc9-903c-349391d84efb\" (UID: \"feb6a9a7-403e-4dc9-903c-349391d84efb\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.761003 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-generated\") pod \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\" (UID: \"157f3f65-3397-4a2d-98ea-1ae5897c7a76\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.761231 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.762134 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-logs" (OuterVolumeSpecName: "logs") pod "d1a24973-6ef6-4732-9a96-040ce646a707" (UID: "d1a24973-6ef6-4732-9a96-040ce646a707"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.765175 4991 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.765200 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4wnb\" (UniqueName: \"kubernetes.io/projected/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kube-api-access-h4wnb\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.765214 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.775695 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305f56cb-d896-435c-ae06-4a407714b503-kube-api-access-w6bxl" (OuterVolumeSpecName: "kube-api-access-w6bxl") pod "305f56cb-d896-435c-ae06-4a407714b503" (UID: "305f56cb-d896-435c-ae06-4a407714b503"). InnerVolumeSpecName "kube-api-access-w6bxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.780898 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d1a24973-6ef6-4732-9a96-040ce646a707" (UID: "d1a24973-6ef6-4732-9a96-040ce646a707"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.782569 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.784664 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementd82d-account-delete-jlr78"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.784698 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementd82d-account-delete-jlr78"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.797434 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.797473 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.797485 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-79798cd5c5-jz6kb"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.804510 4991 scope.go:117] "RemoveContainer" containerID="d85323f86704585d0954acacab967e959246d8405dc02badbe6e793e45cbe71b" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.805271 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb6a9a7-403e-4dc9-903c-349391d84efb-logs" (OuterVolumeSpecName: "logs") pod "feb6a9a7-403e-4dc9-903c-349391d84efb" (UID: "feb6a9a7-403e-4dc9-903c-349391d84efb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.805752 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-79798cd5c5-jz6kb"] Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.805807 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "157f3f65-3397-4a2d-98ea-1ae5897c7a76" (UID: "157f3f65-3397-4a2d-98ea-1ae5897c7a76"). 
InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.807279 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.807951 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "157f3f65-3397-4a2d-98ea-1ae5897c7a76" (UID: "157f3f65-3397-4a2d-98ea-1ae5897c7a76"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.807985 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "157f3f65-3397-4a2d-98ea-1ae5897c7a76" (UID: "157f3f65-3397-4a2d-98ea-1ae5897c7a76"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.812426 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-scripts" (OuterVolumeSpecName: "scripts") pod "d1a24973-6ef6-4732-9a96-040ce646a707" (UID: "d1a24973-6ef6-4732-9a96-040ce646a707"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.812619 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "d1a24973-6ef6-4732-9a96-040ce646a707" (UID: "d1a24973-6ef6-4732-9a96-040ce646a707"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.821500 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb6a9a7-403e-4dc9-903c-349391d84efb-kube-api-access-xdf7w" (OuterVolumeSpecName: "kube-api-access-xdf7w") pod "feb6a9a7-403e-4dc9-903c-349391d84efb" (UID: "feb6a9a7-403e-4dc9-903c-349391d84efb"). InnerVolumeSpecName "kube-api-access-xdf7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.822874 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.824004 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1297ce-72cf-4b07-a66d-826e8e9c1663-kube-api-access-zg7vn" (OuterVolumeSpecName: "kube-api-access-zg7vn") pod "4f1297ce-72cf-4b07-a66d-826e8e9c1663" (UID: "4f1297ce-72cf-4b07-a66d-826e8e9c1663"). InnerVolumeSpecName "kube-api-access-zg7vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.833135 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.835554 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.838067 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50622552-6b5c-4af5-a457-09c526c54f3f-kube-api-access-5jd6h" (OuterVolumeSpecName: "kube-api-access-5jd6h") pod "50622552-6b5c-4af5-a457-09c526c54f3f" (UID: "50622552-6b5c-4af5-a457-09c526c54f3f"). InnerVolumeSpecName "kube-api-access-5jd6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.838818 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a24973-6ef6-4732-9a96-040ce646a707-kube-api-access-tjp9s" (OuterVolumeSpecName: "kube-api-access-tjp9s") pod "d1a24973-6ef6-4732-9a96-040ce646a707" (UID: "d1a24973-6ef6-4732-9a96-040ce646a707"). InnerVolumeSpecName "kube-api-access-tjp9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.843258 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.853659 4991 scope.go:117] "RemoveContainer" containerID="6cff5f4fbbdd1f8906fc62c1c37a25292d4484752ab929ce099738e8a4117501" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.856141 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.858190 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-scripts" (OuterVolumeSpecName: "scripts") pod "feb6a9a7-403e-4dc9-903c-349391d84efb" (UID: "feb6a9a7-403e-4dc9-903c-349391d84efb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.859586 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1a24973-6ef6-4732-9a96-040ce646a707" (UID: "d1a24973-6ef6-4732-9a96-040ce646a707"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.862267 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1ea2a-account-delete-5smxw" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.863495 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "157f3f65-3397-4a2d-98ea-1ae5897c7a76" (UID: "157f3f65-3397-4a2d-98ea-1ae5897c7a76"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866456 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data\") pod \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866506 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/815c282e-cc40-4ff8-b3f8-155d9a91a20b-etc-machine-id\") pod \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866536 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-internal-tls-certs\") pod \"23e696d7-7767-4a92-9828-a189ffb52275\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866566 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-combined-ca-bundle\") pod 
\"c5b53689-326b-4f4c-a625-beec7a3631fa\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866612 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e2b1c5-03aa-4472-9002-7daf936edc67-logs\") pod \"70e2b1c5-03aa-4472-9002-7daf936edc67\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866636 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data\") pod \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866658 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfnrn\" (UniqueName: \"kubernetes.io/projected/c5b53689-326b-4f4c-a625-beec7a3631fa-kube-api-access-xfnrn\") pod \"c5b53689-326b-4f4c-a625-beec7a3631fa\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866681 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6dk6\" (UniqueName: \"kubernetes.io/projected/aa57b1fb-c743-4137-9501-a0110f385b1c-kube-api-access-t6dk6\") pod \"aa57b1fb-c743-4137-9501-a0110f385b1c\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866698 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-memcached-tls-certs\") pod \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866713 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-config-data\") pod \"c5b53689-326b-4f4c-a625-beec7a3631fa\" (UID: \"c5b53689-326b-4f4c-a625-beec7a3631fa\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866737 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-nova-metadata-tls-certs\") pod \"70e2b1c5-03aa-4472-9002-7daf936edc67\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866761 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-config-data\") pod \"23e696d7-7767-4a92-9828-a189ffb52275\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866791 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtxm2\" (UniqueName: \"kubernetes.io/projected/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kube-api-access-mtxm2\") pod \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866826 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb6nm\" (UniqueName: \"kubernetes.io/projected/815c282e-cc40-4ff8-b3f8-155d9a91a20b-kube-api-access-zb6nm\") pod \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866856 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-combined-ca-bundle\") pod \"fded3e15-f946-4f86-bed4-2c4a3262395a\" (UID: 
\"fded3e15-f946-4f86-bed4-2c4a3262395a\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866954 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-config-data\") pod \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.866979 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-certs\") pod \"fded3e15-f946-4f86-bed4-2c4a3262395a\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867050 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9be32ba-d183-4fd5-ba8b-63f79c973c81-logs\") pod \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867072 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vg9r\" (UniqueName: \"kubernetes.io/projected/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-api-access-4vg9r\") pod \"fded3e15-f946-4f86-bed4-2c4a3262395a\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867128 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-public-tls-certs\") pod \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867155 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-combined-ca-bundle\") pod \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867251 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/815c282e-cc40-4ff8-b3f8-155d9a91a20b-logs\") pod \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867317 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-combined-ca-bundle\") pod \"23e696d7-7767-4a92-9828-a189ffb52275\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867340 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-scripts\") pod \"aa57b1fb-c743-4137-9501-a0110f385b1c\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867398 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-internal-tls-certs\") pod \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867422 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-httpd-run\") pod \"aa57b1fb-c743-4137-9501-a0110f385b1c\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867470 4991 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jdfx\" (UniqueName: \"kubernetes.io/projected/23e696d7-7767-4a92-9828-a189ffb52275-kube-api-access-7jdfx\") pod \"23e696d7-7767-4a92-9828-a189ffb52275\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867490 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-public-tls-certs\") pod \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867538 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-scripts\") pod \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.867564 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-config-data\") pod \"70e2b1c5-03aa-4472-9002-7daf936edc67\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.870732 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlwln\" (UniqueName: \"kubernetes.io/projected/70e2b1c5-03aa-4472-9002-7daf936edc67-kube-api-access-hlwln\") pod \"70e2b1c5-03aa-4472-9002-7daf936edc67\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.870751 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e696d7-7767-4a92-9828-a189ffb52275-logs\") pod \"23e696d7-7767-4a92-9828-a189ffb52275\" (UID: 
\"23e696d7-7767-4a92-9828-a189ffb52275\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.870772 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kolla-config\") pod \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.870802 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"aa57b1fb-c743-4137-9501-a0110f385b1c\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.870836 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-internal-tls-certs\") pod \"aa57b1fb-c743-4137-9501-a0110f385b1c\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.870864 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-logs\") pod \"aa57b1fb-c743-4137-9501-a0110f385b1c\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.870890 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-public-tls-certs\") pod \"23e696d7-7767-4a92-9828-a189ffb52275\" (UID: \"23e696d7-7767-4a92-9828-a189ffb52275\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.870914 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-combined-ca-bundle\") pod \"aa57b1fb-c743-4137-9501-a0110f385b1c\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.870953 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-internal-tls-certs\") pod \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.870976 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-combined-ca-bundle\") pod \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\" (UID: \"033164fc-5a6f-4b9d-8c3a-1e4242078c9e\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.871000 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-config-data\") pod \"aa57b1fb-c743-4137-9501-a0110f385b1c\" (UID: \"aa57b1fb-c743-4137-9501-a0110f385b1c\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.871044 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-config\") pod \"fded3e15-f946-4f86-bed4-2c4a3262395a\" (UID: \"fded3e15-f946-4f86-bed4-2c4a3262395a\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.871073 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-combined-ca-bundle\") pod \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " Oct 06 08:42:40 crc 
kubenswrapper[4991]: I1006 08:42:40.871204 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/815c282e-cc40-4ff8-b3f8-155d9a91a20b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "815c282e-cc40-4ff8-b3f8-155d9a91a20b" (UID: "815c282e-cc40-4ff8-b3f8-155d9a91a20b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.871636 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data-custom\") pod \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\" (UID: \"815c282e-cc40-4ff8-b3f8-155d9a91a20b\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.871668 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data-custom\") pod \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.871696 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc554\" (UniqueName: \"kubernetes.io/projected/a9be32ba-d183-4fd5-ba8b-63f79c973c81-kube-api-access-hc554\") pod \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\" (UID: \"a9be32ba-d183-4fd5-ba8b-63f79c973c81\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.871725 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-combined-ca-bundle\") pod \"70e2b1c5-03aa-4472-9002-7daf936edc67\" (UID: \"70e2b1c5-03aa-4472-9002-7daf936edc67\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.871954 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-config-data" (OuterVolumeSpecName: "config-data") pod "033164fc-5a6f-4b9d-8c3a-1e4242078c9e" (UID: "033164fc-5a6f-4b9d-8c3a-1e4242078c9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872472 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jd6h\" (UniqueName: \"kubernetes.io/projected/50622552-6b5c-4af5-a457-09c526c54f3f-kube-api-access-5jd6h\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872489 4991 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872510 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872520 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872529 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg7vn\" (UniqueName: \"kubernetes.io/projected/4f1297ce-72cf-4b07-a66d-826e8e9c1663-kube-api-access-zg7vn\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872538 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/feb6a9a7-403e-4dc9-903c-349391d84efb-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872546 4991 reconciler_common.go:293] "Volume detached for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/815c282e-cc40-4ff8-b3f8-155d9a91a20b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872556 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6bxl\" (UniqueName: \"kubernetes.io/projected/305f56cb-d896-435c-ae06-4a407714b503-kube-api-access-w6bxl\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872565 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872573 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872582 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872591 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjp9s\" (UniqueName: \"kubernetes.io/projected/d1a24973-6ef6-4732-9a96-040ce646a707-kube-api-access-tjp9s\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872599 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872612 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872621 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdf7w\" (UniqueName: \"kubernetes.io/projected/feb6a9a7-403e-4dc9-903c-349391d84efb-kube-api-access-xdf7w\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872630 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/157f3f65-3397-4a2d-98ea-1ae5897c7a76-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872638 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1a24973-6ef6-4732-9a96-040ce646a707-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.872767 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23e696d7-7767-4a92-9828-a189ffb52275-logs" (OuterVolumeSpecName: "logs") pod "23e696d7-7767-4a92-9828-a189ffb52275" (UID: "23e696d7-7767-4a92-9828-a189ffb52275"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.877181 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9be32ba-d183-4fd5-ba8b-63f79c973c81-logs" (OuterVolumeSpecName: "logs") pod "a9be32ba-d183-4fd5-ba8b-63f79c973c81" (UID: "a9be32ba-d183-4fd5-ba8b-63f79c973c81"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.881069 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "033164fc-5a6f-4b9d-8c3a-1e4242078c9e" (UID: "033164fc-5a6f-4b9d-8c3a-1e4242078c9e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.882601 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70e2b1c5-03aa-4472-9002-7daf936edc67-logs" (OuterVolumeSpecName: "logs") pod "70e2b1c5-03aa-4472-9002-7daf936edc67" (UID: "70e2b1c5-03aa-4472-9002-7daf936edc67"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.882951 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-logs" (OuterVolumeSpecName: "logs") pod "aa57b1fb-c743-4137-9501-a0110f385b1c" (UID: "aa57b1fb-c743-4137-9501-a0110f385b1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.894998 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-scripts" (OuterVolumeSpecName: "scripts") pod "aa57b1fb-c743-4137-9501-a0110f385b1c" (UID: "aa57b1fb-c743-4137-9501-a0110f385b1c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.895381 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "aa57b1fb-c743-4137-9501-a0110f385b1c" (UID: "aa57b1fb-c743-4137-9501-a0110f385b1c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.900778 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815c282e-cc40-4ff8-b3f8-155d9a91a20b-logs" (OuterVolumeSpecName: "logs") pod "815c282e-cc40-4ff8-b3f8-155d9a91a20b" (UID: "815c282e-cc40-4ff8-b3f8-155d9a91a20b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.906660 4991 scope.go:117] "RemoveContainer" containerID="c6b5f60779d336e5fbce098f6d9e1800e575b0fba43736ffc5139e17473ca9ec" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.933187 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b53689-326b-4f4c-a625-beec7a3631fa-kube-api-access-xfnrn" (OuterVolumeSpecName: "kube-api-access-xfnrn") pod "c5b53689-326b-4f4c-a625-beec7a3631fa" (UID: "c5b53689-326b-4f4c-a625-beec7a3631fa"). InnerVolumeSpecName "kube-api-access-xfnrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.933266 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-scripts" (OuterVolumeSpecName: "scripts") pod "815c282e-cc40-4ff8-b3f8-155d9a91a20b" (UID: "815c282e-cc40-4ff8-b3f8-155d9a91a20b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.933844 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a9be32ba-d183-4fd5-ba8b-63f79c973c81" (UID: "a9be32ba-d183-4fd5-ba8b-63f79c973c81"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.936101 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "aa57b1fb-c743-4137-9501-a0110f385b1c" (UID: "aa57b1fb-c743-4137-9501-a0110f385b1c"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.940824 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "815c282e-cc40-4ff8-b3f8-155d9a91a20b" (UID: "815c282e-cc40-4ff8-b3f8-155d9a91a20b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.946040 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9be32ba-d183-4fd5-ba8b-63f79c973c81-kube-api-access-hc554" (OuterVolumeSpecName: "kube-api-access-hc554") pod "a9be32ba-d183-4fd5-ba8b-63f79c973c81" (UID: "a9be32ba-d183-4fd5-ba8b-63f79c973c81"). InnerVolumeSpecName "kube-api-access-hc554". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.948257 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e2b1c5-03aa-4472-9002-7daf936edc67-kube-api-access-hlwln" (OuterVolumeSpecName: "kube-api-access-hlwln") pod "70e2b1c5-03aa-4472-9002-7daf936edc67" (UID: "70e2b1c5-03aa-4472-9002-7daf936edc67"). InnerVolumeSpecName "kube-api-access-hlwln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.962439 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815c282e-cc40-4ff8-b3f8-155d9a91a20b-kube-api-access-zb6nm" (OuterVolumeSpecName: "kube-api-access-zb6nm") pod "815c282e-cc40-4ff8-b3f8-155d9a91a20b" (UID: "815c282e-cc40-4ff8-b3f8-155d9a91a20b"). InnerVolumeSpecName "kube-api-access-zb6nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.962739 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e696d7-7767-4a92-9828-a189ffb52275-kube-api-access-7jdfx" (OuterVolumeSpecName: "kube-api-access-7jdfx") pod "23e696d7-7767-4a92-9828-a189ffb52275" (UID: "23e696d7-7767-4a92-9828-a189ffb52275"). InnerVolumeSpecName "kube-api-access-7jdfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.962757 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa57b1fb-c743-4137-9501-a0110f385b1c-kube-api-access-t6dk6" (OuterVolumeSpecName: "kube-api-access-t6dk6") pod "aa57b1fb-c743-4137-9501-a0110f385b1c" (UID: "aa57b1fb-c743-4137-9501-a0110f385b1c"). InnerVolumeSpecName "kube-api-access-t6dk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.962772 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kube-api-access-mtxm2" (OuterVolumeSpecName: "kube-api-access-mtxm2") pod "033164fc-5a6f-4b9d-8c3a-1e4242078c9e" (UID: "033164fc-5a6f-4b9d-8c3a-1e4242078c9e"). InnerVolumeSpecName "kube-api-access-mtxm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.962786 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-api-access-4vg9r" (OuterVolumeSpecName: "kube-api-access-4vg9r") pod "fded3e15-f946-4f86-bed4-2c4a3262395a" (UID: "fded3e15-f946-4f86-bed4-2c4a3262395a"). InnerVolumeSpecName "kube-api-access-4vg9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.970776 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-config-data" (OuterVolumeSpecName: "config-data") pod "d1a24973-6ef6-4732-9a96-040ce646a707" (UID: "d1a24973-6ef6-4732-9a96-040ce646a707"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.976683 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pmvv\" (UniqueName: \"kubernetes.io/projected/f2791937-a79f-4d99-b895-6d3ac79ba220-kube-api-access-8pmvv\") pod \"f2791937-a79f-4d99-b895-6d3ac79ba220\" (UID: \"f2791937-a79f-4d99-b895-6d3ac79ba220\") " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977047 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6dk6\" (UniqueName: \"kubernetes.io/projected/aa57b1fb-c743-4137-9501-a0110f385b1c-kube-api-access-t6dk6\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977182 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtxm2\" (UniqueName: \"kubernetes.io/projected/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kube-api-access-mtxm2\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977217 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977239 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb6nm\" (UniqueName: \"kubernetes.io/projected/815c282e-cc40-4ff8-b3f8-155d9a91a20b-kube-api-access-zb6nm\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977250 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9be32ba-d183-4fd5-ba8b-63f79c973c81-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977260 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vg9r\" (UniqueName: 
\"kubernetes.io/projected/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-api-access-4vg9r\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977271 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/815c282e-cc40-4ff8-b3f8-155d9a91a20b-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977280 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977288 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977309 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jdfx\" (UniqueName: \"kubernetes.io/projected/23e696d7-7767-4a92-9828-a189ffb52275-kube-api-access-7jdfx\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977318 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977326 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e696d7-7767-4a92-9828-a189ffb52275-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977335 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlwln\" (UniqueName: \"kubernetes.io/projected/70e2b1c5-03aa-4472-9002-7daf936edc67-kube-api-access-hlwln\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 
08:42:40.977347 4991 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977373 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977383 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa57b1fb-c743-4137-9501-a0110f385b1c-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977394 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977404 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977414 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc554\" (UniqueName: \"kubernetes.io/projected/a9be32ba-d183-4fd5-ba8b-63f79c973c81-kube-api-access-hc554\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977425 4991 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e2b1c5-03aa-4472-9002-7daf936edc67-logs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.977434 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfnrn\" (UniqueName: 
\"kubernetes.io/projected/c5b53689-326b-4f4c-a625-beec7a3631fa-kube-api-access-xfnrn\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:40 crc kubenswrapper[4991]: I1006 08:42:40.996009 4991 scope.go:117] "RemoveContainer" containerID="2b98780e70d84a8aec415e425c48e44718a23c872945a5f7884260c8ef099a6e" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.004797 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "157f3f65-3397-4a2d-98ea-1ae5897c7a76" (UID: "157f3f65-3397-4a2d-98ea-1ae5897c7a76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.005805 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2791937-a79f-4d99-b895-6d3ac79ba220-kube-api-access-8pmvv" (OuterVolumeSpecName: "kube-api-access-8pmvv") pod "f2791937-a79f-4d99-b895-6d3ac79ba220" (UID: "f2791937-a79f-4d99-b895-6d3ac79ba220"). InnerVolumeSpecName "kube-api-access-8pmvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.027661 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.043425 4991 scope.go:117] "RemoveContainer" containerID="7145e5dc1f6f11d5b7c94e4ed5a3f94d613b31585eea12e4bfb621d2b89f737e" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.078945 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.078968 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.078978 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pmvv\" (UniqueName: \"kubernetes.io/projected/f2791937-a79f-4d99-b895-6d3ac79ba220-kube-api-access-8pmvv\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.091668 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.112493 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d1a24973-6ef6-4732-9a96-040ce646a707" (UID: "d1a24973-6ef6-4732-9a96-040ce646a707"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.147197 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "815c282e-cc40-4ff8-b3f8-155d9a91a20b" (UID: "815c282e-cc40-4ff8-b3f8-155d9a91a20b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.194698 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1a24973-6ef6-4732-9a96-040ce646a707-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.194736 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.194757 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.232062 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70e2b1c5-03aa-4472-9002-7daf936edc67" (UID: "70e2b1c5-03aa-4472-9002-7daf936edc67"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.239542 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data" (OuterVolumeSpecName: "config-data") pod "815c282e-cc40-4ff8-b3f8-155d9a91a20b" (UID: "815c282e-cc40-4ff8-b3f8-155d9a91a20b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.245586 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.255369 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="674d98a6-e32d-47c6-bf03-4ecdc611beb4" path="/var/lib/kubelet/pods/674d98a6-e32d-47c6-bf03-4ecdc611beb4/volumes" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.256036 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" path="/var/lib/kubelet/pods/801bcc07-7874-4eb8-8447-40178d80ea09/volumes" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.256594 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87faa73b-1148-48ae-88f4-3bdd06898658" path="/var/lib/kubelet/pods/87faa73b-1148-48ae-88f4-3bdd06898658/volumes" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.257555 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab7f3760-250c-4e34-8bde-7e9218b711ff" path="/var/lib/kubelet/pods/ab7f3760-250c-4e34-8bde-7e9218b711ff/volumes" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.258303 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" path="/var/lib/kubelet/pods/b2720ee8-eb06-4a0b-9bee-153b69ee769e/volumes" Oct 06 08:42:41 crc 
kubenswrapper[4991]: I1006 08:42:41.258896 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ee6cf5-c041-47cf-aace-ccc53c7d2092" path="/var/lib/kubelet/pods/d5ee6cf5-c041-47cf-aace-ccc53c7d2092/volumes" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.259780 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4175b5d-7866-481a-a923-1ae5f3307195" path="/var/lib/kubelet/pods/f4175b5d-7866-481a-a923-1ae5f3307195/volumes" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.274083 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-config-data" (OuterVolumeSpecName: "config-data") pod "aa57b1fb-c743-4137-9501-a0110f385b1c" (UID: "aa57b1fb-c743-4137-9501-a0110f385b1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.274101 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9be32ba-d183-4fd5-ba8b-63f79c973c81" (UID: "a9be32ba-d183-4fd5-ba8b-63f79c973c81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.294766 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "feb6a9a7-403e-4dc9-903c-349391d84efb" (UID: "feb6a9a7-403e-4dc9-903c-349391d84efb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.295927 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.295945 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.295955 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.295963 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.295973 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.295981 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.330396 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data" (OuterVolumeSpecName: "config-data") pod "a9be32ba-d183-4fd5-ba8b-63f79c973c81" (UID: "a9be32ba-d183-4fd5-ba8b-63f79c973c81"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.357991 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1ea2a-account-delete-5smxw" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.362197 4991 generic.go:334] "Generic (PLEG): container finished" podID="697548ef-9b89-4827-a5f1-4e535ae94722" containerID="758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98" exitCode=0 Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.363516 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-config-data" (OuterVolumeSpecName: "config-data") pod "c5b53689-326b-4f4c-a625-beec7a3631fa" (UID: "c5b53689-326b-4f4c-a625-beec7a3631fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.365587 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicancd31-account-delete-l9h8d" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.369169 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.369463 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.369616 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.369761 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance15db-account-delete-rpvcb" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.374124 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.374618 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-548cc795f4-8m4d9" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.374714 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b98fcbb5b-2m256" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.374748 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronb842-account-delete-b9lmt" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.374779 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.374803 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.374815 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.374840 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.374926 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.379067 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "fded3e15-f946-4f86-bed4-2c4a3262395a" (UID: "fded3e15-f946-4f86-bed4-2c4a3262395a"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.400042 4991 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.400072 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.400086 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.403898 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "feb6a9a7-403e-4dc9-903c-349391d84efb" (UID: "feb6a9a7-403e-4dc9-903c-349391d84efb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.414785 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "815c282e-cc40-4ff8-b3f8-155d9a91a20b" (UID: "815c282e-cc40-4ff8-b3f8-155d9a91a20b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.423890 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fded3e15-f946-4f86-bed4-2c4a3262395a" (UID: "fded3e15-f946-4f86-bed4-2c4a3262395a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.432389 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "feb6a9a7-403e-4dc9-903c-349391d84efb" (UID: "feb6a9a7-403e-4dc9-903c-349391d84efb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.439973 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-config-data" (OuterVolumeSpecName: "config-data") pod "70e2b1c5-03aa-4472-9002-7daf936edc67" (UID: "70e2b1c5-03aa-4472-9002-7daf936edc67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.441693 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "157f3f65-3397-4a2d-98ea-1ae5897c7a76" (UID: "157f3f65-3397-4a2d-98ea-1ae5897c7a76"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.446780 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "70e2b1c5-03aa-4472-9002-7daf936edc67" (UID: "70e2b1c5-03aa-4472-9002-7daf936edc67"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.456544 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "033164fc-5a6f-4b9d-8c3a-1e4242078c9e" (UID: "033164fc-5a6f-4b9d-8c3a-1e4242078c9e"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.457267 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23e696d7-7767-4a92-9828-a189ffb52275" (UID: "23e696d7-7767-4a92-9828-a189ffb52275"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.463592 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a9be32ba-d183-4fd5-ba8b-63f79c973c81" (UID: "a9be32ba-d183-4fd5-ba8b-63f79c973c81"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.470388 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "033164fc-5a6f-4b9d-8c3a-1e4242078c9e" (UID: "033164fc-5a6f-4b9d-8c3a-1e4242078c9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.472252 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-config-data" (OuterVolumeSpecName: "config-data") pod "23e696d7-7767-4a92-9828-a189ffb52275" (UID: "23e696d7-7767-4a92-9828-a189ffb52275"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.481789 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "23e696d7-7767-4a92-9828-a189ffb52275" (UID: "23e696d7-7767-4a92-9828-a189ffb52275"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501484 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501510 4991 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/157f3f65-3397-4a2d-98ea-1ae5897c7a76-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501520 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501530 4991 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/033164fc-5a6f-4b9d-8c3a-1e4242078c9e-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501539 4991 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501549 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501557 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: 
I1006 08:42:41.501566 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501574 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501593 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501600 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501609 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.501617 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e2b1c5-03aa-4472-9002-7daf936edc67-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.502410 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5b53689-326b-4f4c-a625-beec7a3631fa" (UID: "c5b53689-326b-4f4c-a625-beec7a3631fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.506606 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a9be32ba-d183-4fd5-ba8b-63f79c973c81" (UID: "a9be32ba-d183-4fd5-ba8b-63f79c973c81"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.510789 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "23e696d7-7767-4a92-9828-a189ffb52275" (UID: "23e696d7-7767-4a92-9828-a189ffb52275"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.512067 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "fded3e15-f946-4f86-bed4-2c4a3262395a" (UID: "fded3e15-f946-4f86-bed4-2c4a3262395a"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.516219 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.558276 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1ea2a-account-delete-5smxw" event={"ID":"f2791937-a79f-4d99-b895-6d3ac79ba220","Type":"ContainerDied","Data":"d9d372f6649b7ae2da7a793afd5a30828d575d6c4933357147853b3bad61a125"} Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.558771 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"697548ef-9b89-4827-a5f1-4e535ae94722","Type":"ContainerDied","Data":"758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98"} Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.558788 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"697548ef-9b89-4827-a5f1-4e535ae94722","Type":"ContainerDied","Data":"91e294a7b3bb344318caee75d31a75ac164141d0fe2fb46d42bd4c98fc504e8a"} Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.558801 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"033164fc-5a6f-4b9d-8c3a-1e4242078c9e","Type":"ContainerDied","Data":"44882ae90323aa32d1ae693d1acd497085921d30fdb2bc915ad61024da4420d8"} Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.558819 4991 scope.go:117] "RemoveContainer" containerID="cba219b856cd8afbdba07fd20346d5c2d98eb76e2dd1e02ef4b5e55c609acd84" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.558938 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa57b1fb-c743-4137-9501-a0110f385b1c" (UID: "aa57b1fb-c743-4137-9501-a0110f385b1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.585858 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-config-data" (OuterVolumeSpecName: "config-data") pod "feb6a9a7-403e-4dc9-903c-349391d84efb" (UID: "feb6a9a7-403e-4dc9-903c-349391d84efb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.587846 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "aa57b1fb-c743-4137-9501-a0110f385b1c" (UID: "aa57b1fb-c743-4137-9501-a0110f385b1c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.602776 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plvkz\" (UniqueName: \"kubernetes.io/projected/697548ef-9b89-4827-a5f1-4e535ae94722-kube-api-access-plvkz\") pod \"697548ef-9b89-4827-a5f1-4e535ae94722\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.602834 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-config-data\") pod \"697548ef-9b89-4827-a5f1-4e535ae94722\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.602863 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-combined-ca-bundle\") pod \"697548ef-9b89-4827-a5f1-4e535ae94722\" (UID: \"697548ef-9b89-4827-a5f1-4e535ae94722\") " 
Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.603203 4991 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fded3e15-f946-4f86-bed4-2c4a3262395a-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.603214 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9be32ba-d183-4fd5-ba8b-63f79c973c81-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.603223 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.603231 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e696d7-7767-4a92-9828-a189ffb52275-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.603239 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa57b1fb-c743-4137-9501-a0110f385b1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.603247 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6a9a7-403e-4dc9-903c-349391d84efb-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.603255 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b53689-326b-4f4c-a625-beec7a3631fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.610526 4991 scope.go:117] 
"RemoveContainer" containerID="758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.615867 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronb842-account-delete-b9lmt"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.616450 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "815c282e-cc40-4ff8-b3f8-155d9a91a20b" (UID: "815c282e-cc40-4ff8-b3f8-155d9a91a20b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.622965 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutronb842-account-delete-b9lmt"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.625667 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/697548ef-9b89-4827-a5f1-4e535ae94722-kube-api-access-plvkz" (OuterVolumeSpecName: "kube-api-access-plvkz") pod "697548ef-9b89-4827-a5f1-4e535ae94722" (UID: "697548ef-9b89-4827-a5f1-4e535ae94722"). InnerVolumeSpecName "kube-api-access-plvkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.636134 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicancd31-account-delete-l9h8d"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.645197 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicancd31-account-delete-l9h8d"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.652244 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1ea2a-account-delete-5smxw"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.653939 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "697548ef-9b89-4827-a5f1-4e535ae94722" (UID: "697548ef-9b89-4827-a5f1-4e535ae94722"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.660673 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell1ea2a-account-delete-5smxw"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.663542 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-config-data" (OuterVolumeSpecName: "config-data") pod "697548ef-9b89-4827-a5f1-4e535ae94722" (UID: "697548ef-9b89-4827-a5f1-4e535ae94722"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.669144 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.679895 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.687589 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance15db-account-delete-rpvcb"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.693001 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance15db-account-delete-rpvcb"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.704314 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/815c282e-cc40-4ff8-b3f8-155d9a91a20b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.704345 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plvkz\" (UniqueName: \"kubernetes.io/projected/697548ef-9b89-4827-a5f1-4e535ae94722-kube-api-access-plvkz\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.704359 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.704371 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/697548ef-9b89-4827-a5f1-4e535ae94722-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.712816 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 
08:42:41.717474 4991 scope.go:117] "RemoveContainer" containerID="758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.721776 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.732958 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: E1006 08:42:41.736177 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98\": container with ID starting with 758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98 not found: ID does not exist" containerID="758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.736228 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98"} err="failed to get container status \"758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98\": rpc error: code = NotFound desc = could not find container \"758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98\": container with ID starting with 758630a21a63424a5807c33fac03ffb0fa1723ba48293b99eff3ac4735f5de98 not found: ID does not exist" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.736254 4991 scope.go:117] "RemoveContainer" containerID="beb72f2fe0b1d7a0ddb9249baead2d79de7b72973b4fde75ed6d9bc96c982e97" Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.741193 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.745636 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-548cc795f4-8m4d9"] Oct 06 08:42:41 crc 
kubenswrapper[4991]: I1006 08:42:41.760376 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-548cc795f4-8m4d9"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.769623 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.776842 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.796890 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.823664 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.871279 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b98fcbb5b-2m256"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.876011 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6b98fcbb5b-2m256"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.877853 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.888364 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.896687 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.899314 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.905367 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: E1006 08:42:41.913840 4991 configmap.go:193] Couldn't get configMap 
openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 06 08:42:41 crc kubenswrapper[4991]: E1006 08:42:41.913896 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data podName:1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:49.913882305 +0000 UTC m=+1421.651632326 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data") pod "rabbitmq-cell1-server-0" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1") : configmap "rabbitmq-cell1-config-data" not found Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.916715 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.922726 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.927364 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 08:42:41 crc kubenswrapper[4991]: I1006 08:42:41.980388 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.015203 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-default\") pod \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.015319 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-galera-tls-certs\") pod \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.015371 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5cbx\" (UniqueName: \"kubernetes.io/projected/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kube-api-access-c5cbx\") pod \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.015408 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-secrets\") pod \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.015437 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kolla-config\") pod \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.015474 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-generated\") pod \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.015511 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-combined-ca-bundle\") pod \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.015539 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-operator-scripts\") pod \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.015559 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\" (UID: \"d1f986ad-8a8d-44d3-b200-479a60f8b8b3\") " Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.016041 4991 secret.go:188] Couldn't get secret openstack/neutron-httpd-config: secret "neutron-httpd-config" not found Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.016092 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config podName:0a6703e0-1fac-4734-98ac-88f6163fdaae nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.016078011 +0000 UTC m=+1421.753828032 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "httpd-config" (UniqueName: "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config") pod "neutron-7988dccf5c-j9ll7" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae") : secret "neutron-httpd-config" not found Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.016302 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d1f986ad-8a8d-44d3-b200-479a60f8b8b3" (UID: "d1f986ad-8a8d-44d3-b200-479a60f8b8b3"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.017389 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d1f986ad-8a8d-44d3-b200-479a60f8b8b3" (UID: "d1f986ad-8a8d-44d3-b200-479a60f8b8b3"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.017545 4991 secret.go:188] Couldn't get secret openstack/neutron-config: secret "neutron-config" not found Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.017600 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config podName:0a6703e0-1fac-4734-98ac-88f6163fdaae nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.017584025 +0000 UTC m=+1421.755334036 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config") pod "neutron-7988dccf5c-j9ll7" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae") : secret "neutron-config" not found Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.017856 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1f986ad-8a8d-44d3-b200-479a60f8b8b3" (UID: "d1f986ad-8a8d-44d3-b200-479a60f8b8b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.017922 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d1f986ad-8a8d-44d3-b200-479a60f8b8b3" (UID: "d1f986ad-8a8d-44d3-b200-479a60f8b8b3"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.022002 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-secrets" (OuterVolumeSpecName: "secrets") pod "d1f986ad-8a8d-44d3-b200-479a60f8b8b3" (UID: "d1f986ad-8a8d-44d3-b200-479a60f8b8b3"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.024096 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kube-api-access-c5cbx" (OuterVolumeSpecName: "kube-api-access-c5cbx") pod "d1f986ad-8a8d-44d3-b200-479a60f8b8b3" (UID: "d1f986ad-8a8d-44d3-b200-479a60f8b8b3"). InnerVolumeSpecName "kube-api-access-c5cbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.028193 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "d1f986ad-8a8d-44d3-b200-479a60f8b8b3" (UID: "d1f986ad-8a8d-44d3-b200-479a60f8b8b3"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.057404 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1f986ad-8a8d-44d3-b200-479a60f8b8b3" (UID: "d1f986ad-8a8d-44d3-b200-479a60f8b8b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.074622 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "d1f986ad-8a8d-44d3-b200-479a60f8b8b3" (UID: "d1f986ad-8a8d-44d3-b200-479a60f8b8b3"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.078206 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.078515 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.078851 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.078885 4991 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server" Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.079333 4991 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.080469 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.081904 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.081943 4991 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovs-vswitchd" Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.094246 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.095561 4991 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.097014 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.097057 4991 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="48f4202b-6558-4fe3-8fcc-732aa1a88e60" containerName="nova-scheduler-scheduler" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.117757 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.117784 4991 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.117793 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5cbx\" (UniqueName: \"kubernetes.io/projected/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kube-api-access-c5cbx\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.117802 4991 reconciler_common.go:293] 
"Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.117810 4991 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.117818 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.117826 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.117834 4991 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1f986ad-8a8d-44d3-b200-479a60f8b8b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.117861 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.156168 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.219327 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:42 crc 
kubenswrapper[4991]: I1006 08:42:42.375034 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.385472 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bc2q" event={"ID":"544d772e-ee45-4bd6-9895-07dec1dc3ff1","Type":"ContainerStarted","Data":"11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1"} Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.387644 4991 generic.go:334] "Generic (PLEG): container finished" podID="d1f986ad-8a8d-44d3-b200-479a60f8b8b3" containerID="9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc" exitCode=0 Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.387676 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d1f986ad-8a8d-44d3-b200-479a60f8b8b3","Type":"ContainerDied","Data":"9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc"} Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.387691 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d1f986ad-8a8d-44d3-b200-479a60f8b8b3","Type":"ContainerDied","Data":"f7fb7232d6c2b21d3774a6e1ede7c12929d141e3b2231f4a00567a59297a81bf"} Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.387707 4991 scope.go:117] "RemoveContainer" containerID="9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.387768 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.416924 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8bc2q" podStartSLOduration=4.73869057 podStartE2EDuration="10.416908675s" podCreationTimestamp="2025-10-06 08:42:32 +0000 UTC" firstStartedPulling="2025-10-06 08:42:35.228429246 +0000 UTC m=+1406.966179267" lastFinishedPulling="2025-10-06 08:42:40.906647351 +0000 UTC m=+1412.644397372" observedRunningTime="2025-10-06 08:42:42.414372521 +0000 UTC m=+1414.152122542" watchObservedRunningTime="2025-10-06 08:42:42.416908675 +0000 UTC m=+1414.154658696" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.490766 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.495434 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.507134 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.515592 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.517420 4991 scope.go:117] "RemoveContainer" containerID="96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.552183 4991 scope.go:117] "RemoveContainer" containerID="9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc" Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.552787 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc\": container with ID starting with 
9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc not found: ID does not exist" containerID="9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.552826 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc"} err="failed to get container status \"9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc\": rpc error: code = NotFound desc = could not find container \"9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc\": container with ID starting with 9811e310797a3cb780e598d5258aa4cedf3ddc92e273e821bad74ab174458cdc not found: ID does not exist" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.552851 4991 scope.go:117] "RemoveContainer" containerID="96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc" Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.553282 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc\": container with ID starting with 96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc not found: ID does not exist" containerID="96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.553329 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc"} err="failed to get container status \"96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc\": rpc error: code = NotFound desc = could not find container \"96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc\": container with ID starting with 96202f33a285fac8c19a37ae1518bcc574673c7c2cd374e1b1aafd4665b248fc not found: ID does not 
exist" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.747483 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.814992 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-jklxx" podUID="188f566f-7d4a-4b9f-b74d-bbee761c0bea" containerName="ovn-controller" probeResult="failure" output="command timed out" Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.831411 4991 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 06 08:42:42 crc kubenswrapper[4991]: E1006 08:42:42.831494 4991 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data podName:53c6aca4-4fd0-4d42-bbe2-4b6e91643503 nodeName:}" failed. No retries permitted until 2025-10-06 08:42:50.831473877 +0000 UTC m=+1422.569223918 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data") pod "rabbitmq-server-0" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503") : configmap "rabbitmq-config-data" not found Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.853478 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-jklxx" podUID="188f566f-7d4a-4b9f-b74d-bbee761c0bea" containerName="ovn-controller" probeResult="failure" output=< Oct 06 08:42:42 crc kubenswrapper[4991]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Oct 06 08:42:42 crc kubenswrapper[4991]: > Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.863717 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.932956 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-fernet-keys\") pod \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.933105 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-config-data\") pod \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.933156 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4zj7\" (UniqueName: \"kubernetes.io/projected/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-kube-api-access-d4zj7\") pod \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.933195 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-scripts\") pod \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.933255 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-public-tls-certs\") pod \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.933347 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-internal-tls-certs\") pod \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.933406 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-combined-ca-bundle\") pod \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.933481 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-credential-keys\") pod \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\" (UID: \"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb\") " Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.939163 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-kube-api-access-d4zj7" (OuterVolumeSpecName: "kube-api-access-d4zj7") pod "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" (UID: "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb"). InnerVolumeSpecName "kube-api-access-d4zj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.946060 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" (UID: "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.948669 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-scripts" (OuterVolumeSpecName: "scripts") pod "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" (UID: "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.952414 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" (UID: "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:42 crc kubenswrapper[4991]: I1006 08:42:42.996288 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-config-data" (OuterVolumeSpecName: "config-data") pod "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" (UID: "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.033952 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" (UID: "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.034937 4991 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.034974 4991 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.034987 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.035001 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4zj7\" (UniqueName: \"kubernetes.io/projected/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-kube-api-access-d4zj7\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.035014 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.035025 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.039381 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.039504 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.040810 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" (UID: "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.083486 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" (UID: "79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.093602 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.135831 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.135855 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.145459 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_51a7066c-5143-43ab-b642-81f461a9c1f4/ovn-northd/0.log" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.145505 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.236680 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-rundir\") pod \"51a7066c-5143-43ab-b642-81f461a9c1f4\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.236853 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7dbb\" (UniqueName: \"kubernetes.io/projected/51a7066c-5143-43ab-b642-81f461a9c1f4-kube-api-access-r7dbb\") pod \"51a7066c-5143-43ab-b642-81f461a9c1f4\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.236881 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-config\") pod \"51a7066c-5143-43ab-b642-81f461a9c1f4\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.236911 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-metrics-certs-tls-certs\") pod \"51a7066c-5143-43ab-b642-81f461a9c1f4\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.236945 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-combined-ca-bundle\") pod \"51a7066c-5143-43ab-b642-81f461a9c1f4\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.236964 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-scripts\") pod \"51a7066c-5143-43ab-b642-81f461a9c1f4\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.236985 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-northd-tls-certs\") pod \"51a7066c-5143-43ab-b642-81f461a9c1f4\" (UID: \"51a7066c-5143-43ab-b642-81f461a9c1f4\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.238257 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "51a7066c-5143-43ab-b642-81f461a9c1f4" (UID: "51a7066c-5143-43ab-b642-81f461a9c1f4"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.238483 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-config" (OuterVolumeSpecName: "config") pod "51a7066c-5143-43ab-b642-81f461a9c1f4" (UID: "51a7066c-5143-43ab-b642-81f461a9c1f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.239502 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-scripts" (OuterVolumeSpecName: "scripts") pod "51a7066c-5143-43ab-b642-81f461a9c1f4" (UID: "51a7066c-5143-43ab-b642-81f461a9c1f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.248625 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a7066c-5143-43ab-b642-81f461a9c1f4-kube-api-access-r7dbb" (OuterVolumeSpecName: "kube-api-access-r7dbb") pod "51a7066c-5143-43ab-b642-81f461a9c1f4" (UID: "51a7066c-5143-43ab-b642-81f461a9c1f4"). InnerVolumeSpecName "kube-api-access-r7dbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.255331 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033164fc-5a6f-4b9d-8c3a-1e4242078c9e" path="/var/lib/kubelet/pods/033164fc-5a6f-4b9d-8c3a-1e4242078c9e/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.260306 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="157f3f65-3397-4a2d-98ea-1ae5897c7a76" path="/var/lib/kubelet/pods/157f3f65-3397-4a2d-98ea-1ae5897c7a76/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.264920 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e696d7-7767-4a92-9828-a189ffb52275" path="/var/lib/kubelet/pods/23e696d7-7767-4a92-9828-a189ffb52275/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.268313 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.278427 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305f56cb-d896-435c-ae06-4a407714b503" path="/var/lib/kubelet/pods/305f56cb-d896-435c-ae06-4a407714b503/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.278873 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f1297ce-72cf-4b07-a66d-826e8e9c1663" path="/var/lib/kubelet/pods/4f1297ce-72cf-4b07-a66d-826e8e9c1663/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.285705 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50622552-6b5c-4af5-a457-09c526c54f3f" path="/var/lib/kubelet/pods/50622552-6b5c-4af5-a457-09c526c54f3f/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.286157 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="697548ef-9b89-4827-a5f1-4e535ae94722" path="/var/lib/kubelet/pods/697548ef-9b89-4827-a5f1-4e535ae94722/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.286316 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51a7066c-5143-43ab-b642-81f461a9c1f4" (UID: "51a7066c-5143-43ab-b642-81f461a9c1f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.286664 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" path="/var/lib/kubelet/pods/70e2b1c5-03aa-4472-9002-7daf936edc67/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.287706 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" path="/var/lib/kubelet/pods/815c282e-cc40-4ff8-b3f8-155d9a91a20b/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.288525 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" path="/var/lib/kubelet/pods/a9be32ba-d183-4fd5-ba8b-63f79c973c81/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.289308 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa57b1fb-c743-4137-9501-a0110f385b1c" path="/var/lib/kubelet/pods/aa57b1fb-c743-4137-9501-a0110f385b1c/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.290413 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b53689-326b-4f4c-a625-beec7a3631fa" path="/var/lib/kubelet/pods/c5b53689-326b-4f4c-a625-beec7a3631fa/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.291124 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a24973-6ef6-4732-9a96-040ce646a707" path="/var/lib/kubelet/pods/d1a24973-6ef6-4732-9a96-040ce646a707/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.297744 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1f986ad-8a8d-44d3-b200-479a60f8b8b3" path="/var/lib/kubelet/pods/d1f986ad-8a8d-44d3-b200-479a60f8b8b3/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.298341 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2791937-a79f-4d99-b895-6d3ac79ba220" 
path="/var/lib/kubelet/pods/f2791937-a79f-4d99-b895-6d3ac79ba220/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.298900 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fded3e15-f946-4f86-bed4-2c4a3262395a" path="/var/lib/kubelet/pods/fded3e15-f946-4f86-bed4-2c4a3262395a/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.300100 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb6a9a7-403e-4dc9-903c-349391d84efb" path="/var/lib/kubelet/pods/feb6a9a7-403e-4dc9-903c-349391d84efb/volumes" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.326991 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="f4175b5d-7866-481a-a923-1ae5f3307195" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.195:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.338715 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-pod-info\") pod \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.338769 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-tls\") pod \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.338786 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-confd\") pod 
\"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.338849 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-plugins\") pod \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.338886 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-erlang-cookie\") pod \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.338914 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-server-conf\") pod \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.338948 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-plugins-conf\") pod \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.338977 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7987\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-kube-api-access-w7987\") pod \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.339029 4991 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-erlang-cookie-secret\") pod \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.339064 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data\") pod \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.339079 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\" (UID: \"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.339506 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.339523 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7dbb\" (UniqueName: \"kubernetes.io/projected/51a7066c-5143-43ab-b642-81f461a9c1f4-kube-api-access-r7dbb\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.339534 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.339543 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.339551 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51a7066c-5143-43ab-b642-81f461a9c1f4-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.340538 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.340557 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "51a7066c-5143-43ab-b642-81f461a9c1f4" (UID: "51a7066c-5143-43ab-b642-81f461a9c1f4"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.341227 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.341252 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). 
InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.343096 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-kube-api-access-w7987" (OuterVolumeSpecName: "kube-api-access-w7987") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). InnerVolumeSpecName "kube-api-access-w7987". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.343868 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.345883 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.345889 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-pod-info" (OuterVolumeSpecName: "pod-info") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.346099 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.352476 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "51a7066c-5143-43ab-b642-81f461a9c1f4" (UID: "51a7066c-5143-43ab-b642-81f461a9c1f4"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.362422 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.369699 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data" (OuterVolumeSpecName: "config-data") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.401104 4991 generic.go:334] "Generic (PLEG): container finished" podID="79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" containerID="0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0" exitCode=0 Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.401159 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-774597bb4-6c42q" event={"ID":"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb","Type":"ContainerDied","Data":"0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0"} Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.401183 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-774597bb4-6c42q" event={"ID":"79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb","Type":"ContainerDied","Data":"f68568dfb392c06fa1a40abaa08c31d698aef3315cd0b120a40469dd179eccab"} Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.401199 4991 scope.go:117] "RemoveContainer" containerID="0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.401284 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-774597bb4-6c42q" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.408726 4991 generic.go:334] "Generic (PLEG): container finished" podID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" containerID="3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab" exitCode=0 Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.408790 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1","Type":"ContainerDied","Data":"3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab"} Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.408818 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1","Type":"ContainerDied","Data":"4c1ea580b6f933663c7bb9be9554723e3af455aced231068c8eb2051af6ac836"} Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.408904 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.411824 4991 generic.go:334] "Generic (PLEG): container finished" podID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" containerID="304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c" exitCode=0 Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.411904 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.411954 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53c6aca4-4fd0-4d42-bbe2-4b6e91643503","Type":"ContainerDied","Data":"304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c"} Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.412028 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53c6aca4-4fd0-4d42-bbe2-4b6e91643503","Type":"ContainerDied","Data":"396fb26a587e0452b10628b69988716675f31756c67979296f98a06fd8e6573a"} Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.414591 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_51a7066c-5143-43ab-b642-81f461a9c1f4/ovn-northd/0.log" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.414628 4991 generic.go:334] "Generic (PLEG): container finished" podID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerID="fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" exitCode=139 Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.414741 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.414838 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"51a7066c-5143-43ab-b642-81f461a9c1f4","Type":"ContainerDied","Data":"fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865"} Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.415073 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"51a7066c-5143-43ab-b642-81f461a9c1f4","Type":"ContainerDied","Data":"7d92250465135a992c3bbe54f25c5c368684c6fe1917679b89d7d898f54ea694"} Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.423025 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-774597bb4-6c42q"] Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.428725 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-server-conf" (OuterVolumeSpecName: "server-conf") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.429608 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-774597bb4-6c42q"] Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.430574 4991 scope.go:117] "RemoveContainer" containerID="0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0" Oct 06 08:42:43 crc kubenswrapper[4991]: E1006 08:42:43.431026 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0\": container with ID starting with 0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0 not found: ID does not exist" containerID="0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.431108 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0"} err="failed to get container status \"0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0\": rpc error: code = NotFound desc = could not find container \"0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0\": container with ID starting with 0980eba7ede43f6be46933ab1054e82fec65cac2b559aba3b33b7152ea5dd5c0 not found: ID does not exist" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.431184 4991 scope.go:117] "RemoveContainer" containerID="3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440285 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-plugins-conf\") pod \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 
crc kubenswrapper[4991]: I1006 08:42:43.440364 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-pod-info\") pod \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440435 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-tls\") pod \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440460 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data\") pod \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440480 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440495 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjlld\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-kube-api-access-kjlld\") pod \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440520 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-plugins\") pod 
\"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440560 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-confd\") pod \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440579 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-erlang-cookie\") pod \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440609 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-erlang-cookie-secret\") pod \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440627 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-server-conf\") pod \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\" (UID: \"53c6aca4-4fd0-4d42-bbe2-4b6e91643503\") " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.440998 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441018 4991 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441028 4991 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-server-conf\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441037 4991 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441046 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7987\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-kube-api-access-w7987\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441055 4991 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a7066c-5143-43ab-b642-81f461a9c1f4-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441063 4991 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441071 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441091 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " 
Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441102 4991 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-pod-info\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441110 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441118 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441774 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.441837 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.448288 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.450528 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.456036 4991 scope.go:117] "RemoveContainer" containerID="545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.457203 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.463742 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-kube-api-access-kjlld" (OuterVolumeSpecName: "kube-api-access-kjlld") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "kube-api-access-kjlld". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.464109 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.470819 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.480370 4991 scope.go:117] "RemoveContainer" containerID="3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab" Oct 06 08:42:43 crc kubenswrapper[4991]: E1006 08:42:43.480900 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab\": container with ID starting with 3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab not found: ID does not exist" containerID="3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.481043 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab"} err="failed to get container status \"3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab\": rpc error: code = NotFound desc = could not find container \"3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab\": container with ID starting with 3d4cd128f7e636b42c69415ad82cec49790dbc6a2344dadbfdf7b60644c454ab not found: ID does not exist" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.481162 4991 scope.go:117] "RemoveContainer" containerID="545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.481183 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-pod-info" (OuterVolumeSpecName: "pod-info") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" 
(UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: E1006 08:42:43.481545 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381\": container with ID starting with 545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381 not found: ID does not exist" containerID="545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.481625 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381"} err="failed to get container status \"545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381\": rpc error: code = NotFound desc = could not find container \"545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381\": container with ID starting with 545338f0b083a0c7cbfb9d9da6676198ff08693a5b75a48c77eeafe59d4fe381 not found: ID does not exist" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.481687 4991 scope.go:117] "RemoveContainer" containerID="304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.481709 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.481874 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.484981 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data" (OuterVolumeSpecName: "config-data") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.487158 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" (UID: "1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.498620 4991 scope.go:117] "RemoveContainer" containerID="30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.499562 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-server-conf" (OuterVolumeSpecName: "server-conf") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.517228 4991 scope.go:117] "RemoveContainer" containerID="304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c" Oct 06 08:42:43 crc kubenswrapper[4991]: E1006 08:42:43.517848 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c\": container with ID starting with 304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c not found: ID does not exist" containerID="304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.517888 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c"} err="failed to get container status \"304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c\": rpc error: code = NotFound desc = could not find container \"304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c\": container with ID starting with 304b7cf63a4f3e3b8c50629ec01e30c12c0719866dc310ee305fe4c60546097c not found: ID does not exist" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.517913 4991 scope.go:117] "RemoveContainer" containerID="30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f" Oct 06 08:42:43 crc kubenswrapper[4991]: E1006 08:42:43.518237 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f\": container with ID starting with 30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f not found: ID does not exist" containerID="30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.518259 
4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f"} err="failed to get container status \"30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f\": rpc error: code = NotFound desc = could not find container \"30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f\": container with ID starting with 30d12fe2a09790653c0ec3185c8f8c2cd238090b351db2e10e53d51862d3fb5f not found: ID does not exist" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.518273 4991 scope.go:117] "RemoveContainer" containerID="9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.530591 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "53c6aca4-4fd0-4d42-bbe2-4b6e91643503" (UID: "53c6aca4-4fd0-4d42-bbe2-4b6e91643503"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542498 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542530 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542543 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542551 4991 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542560 4991 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-server-conf\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542568 4991 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542578 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542586 4991 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-pod-info\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542594 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542602 4991 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542611 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542637 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.542646 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjlld\" (UniqueName: \"kubernetes.io/projected/53c6aca4-4fd0-4d42-bbe2-4b6e91643503-kube-api-access-kjlld\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.546474 4991 scope.go:117] "RemoveContainer" containerID="fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.566553 4991 scope.go:117] "RemoveContainer" containerID="9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d" Oct 06 08:42:43 crc kubenswrapper[4991]: E1006 08:42:43.567027 4991 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d\": container with ID starting with 9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d not found: ID does not exist" containerID="9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.567070 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d"} err="failed to get container status \"9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d\": rpc error: code = NotFound desc = could not find container \"9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d\": container with ID starting with 9cba8f75159395f6de9602d07155adf5089e892d1cddda8c71b67a527ac0670d not found: ID does not exist" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.567096 4991 scope.go:117] "RemoveContainer" containerID="fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" Oct 06 08:42:43 crc kubenswrapper[4991]: E1006 08:42:43.569446 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865\": container with ID starting with fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865 not found: ID does not exist" containerID="fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.569486 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865"} err="failed to get container status \"fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865\": rpc error: code = NotFound desc = could not find container 
\"fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865\": container with ID starting with fc53492c9b9090465c39b8c9b33e53c74fbe8d5a91446c47a90a33f808b14865 not found: ID does not exist" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.571048 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.644371 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.757639 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.771673 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.779072 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:42:43 crc kubenswrapper[4991]: I1006 08:42:43.783087 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 08:42:44 crc kubenswrapper[4991]: E1006 08:42:44.098482 4991 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 06 08:42:44 crc kubenswrapper[4991]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-06T08:42:36Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 06 08:42:44 crc kubenswrapper[4991]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Oct 06 08:42:44 crc kubenswrapper[4991]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-jklxx" message=< Oct 06 08:42:44 crc kubenswrapper[4991]: Exiting ovn-controller (1) [FAILED] Oct 
06 08:42:44 crc kubenswrapper[4991]: Killing ovn-controller (1) [ OK ] Oct 06 08:42:44 crc kubenswrapper[4991]: Killing ovn-controller (1) with SIGKILL [ OK ] Oct 06 08:42:44 crc kubenswrapper[4991]: 2025-10-06T08:42:36Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 06 08:42:44 crc kubenswrapper[4991]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Oct 06 08:42:44 crc kubenswrapper[4991]: > Oct 06 08:42:44 crc kubenswrapper[4991]: E1006 08:42:44.098520 4991 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 06 08:42:44 crc kubenswrapper[4991]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-06T08:42:36Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 06 08:42:44 crc kubenswrapper[4991]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Oct 06 08:42:44 crc kubenswrapper[4991]: > pod="openstack/ovn-controller-jklxx" podUID="188f566f-7d4a-4b9f-b74d-bbee761c0bea" containerName="ovn-controller" containerID="cri-o://9ce5d29d20258b039798707827257132b489fc0eb7a72cd595e6df3602e635af" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.098551 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-jklxx" podUID="188f566f-7d4a-4b9f-b74d-bbee761c0bea" containerName="ovn-controller" containerID="cri-o://9ce5d29d20258b039798707827257132b489fc0eb7a72cd595e6df3602e635af" gracePeriod=21 Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.429379 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-jklxx_188f566f-7d4a-4b9f-b74d-bbee761c0bea/ovn-controller/0.log" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.429651 4991 generic.go:334] "Generic (PLEG): container finished" podID="188f566f-7d4a-4b9f-b74d-bbee761c0bea" containerID="9ce5d29d20258b039798707827257132b489fc0eb7a72cd595e6df3602e635af" exitCode=137 Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.429726 4991 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jklxx" event={"ID":"188f566f-7d4a-4b9f-b74d-bbee761c0bea","Type":"ContainerDied","Data":"9ce5d29d20258b039798707827257132b489fc0eb7a72cd595e6df3602e635af"} Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.433816 4991 generic.go:334] "Generic (PLEG): container finished" podID="48f4202b-6558-4fe3-8fcc-732aa1a88e60" containerID="4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696" exitCode=0 Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.433881 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"48f4202b-6558-4fe3-8fcc-732aa1a88e60","Type":"ContainerDied","Data":"4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696"} Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.439096 4991 generic.go:334] "Generic (PLEG): container finished" podID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerID="d0aac78aa43c86da1a2d4708b970a7fa2c38a878adf032b4bc160cf815163a9d" exitCode=0 Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.439361 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7988dccf5c-j9ll7" event={"ID":"0a6703e0-1fac-4734-98ac-88f6163fdaae","Type":"ContainerDied","Data":"d0aac78aa43c86da1a2d4708b970a7fa2c38a878adf032b4bc160cf815163a9d"} Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.544261 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-jklxx_188f566f-7d4a-4b9f-b74d-bbee761c0bea/ovn-controller/0.log" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.544348 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jklxx" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.554740 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7988dccf5c-j9ll7" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.565062 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188f566f-7d4a-4b9f-b74d-bbee761c0bea-scripts\") pod \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.565138 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-log-ovn\") pod \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.565265 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-ovn-controller-tls-certs\") pod \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.565285 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run\") pod \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.565333 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp8dl\" (UniqueName: \"kubernetes.io/projected/188f566f-7d4a-4b9f-b74d-bbee761c0bea-kube-api-access-sp8dl\") pod \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.565347 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run-ovn\") pod \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.565389 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-combined-ca-bundle\") pod \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\" (UID: \"188f566f-7d4a-4b9f-b74d-bbee761c0bea\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.565592 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run" (OuterVolumeSpecName: "var-run") pod "188f566f-7d4a-4b9f-b74d-bbee761c0bea" (UID: "188f566f-7d4a-4b9f-b74d-bbee761c0bea"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.565636 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "188f566f-7d4a-4b9f-b74d-bbee761c0bea" (UID: "188f566f-7d4a-4b9f-b74d-bbee761c0bea"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.565724 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "188f566f-7d4a-4b9f-b74d-bbee761c0bea" (UID: "188f566f-7d4a-4b9f-b74d-bbee761c0bea"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.566611 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/188f566f-7d4a-4b9f-b74d-bbee761c0bea-scripts" (OuterVolumeSpecName: "scripts") pod "188f566f-7d4a-4b9f-b74d-bbee761c0bea" (UID: "188f566f-7d4a-4b9f-b74d-bbee761c0bea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.570796 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188f566f-7d4a-4b9f-b74d-bbee761c0bea-kube-api-access-sp8dl" (OuterVolumeSpecName: "kube-api-access-sp8dl") pod "188f566f-7d4a-4b9f-b74d-bbee761c0bea" (UID: "188f566f-7d4a-4b9f-b74d-bbee761c0bea"). InnerVolumeSpecName "kube-api-access-sp8dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.592866 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "188f566f-7d4a-4b9f-b74d-bbee761c0bea" (UID: "188f566f-7d4a-4b9f-b74d-bbee761c0bea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.629573 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "188f566f-7d4a-4b9f-b74d-bbee761c0bea" (UID: "188f566f-7d4a-4b9f-b74d-bbee761c0bea"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.666701 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-public-tls-certs\") pod \"0a6703e0-1fac-4734-98ac-88f6163fdaae\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.666772 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-ovndb-tls-certs\") pod \"0a6703e0-1fac-4734-98ac-88f6163fdaae\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.666812 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config\") pod \"0a6703e0-1fac-4734-98ac-88f6163fdaae\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.666849 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxshx\" (UniqueName: \"kubernetes.io/projected/0a6703e0-1fac-4734-98ac-88f6163fdaae-kube-api-access-lxshx\") pod \"0a6703e0-1fac-4734-98ac-88f6163fdaae\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.666956 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-internal-tls-certs\") pod \"0a6703e0-1fac-4734-98ac-88f6163fdaae\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.667017 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config\") pod \"0a6703e0-1fac-4734-98ac-88f6163fdaae\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.667086 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-combined-ca-bundle\") pod \"0a6703e0-1fac-4734-98ac-88f6163fdaae\" (UID: \"0a6703e0-1fac-4734-98ac-88f6163fdaae\") " Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.667542 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp8dl\" (UniqueName: \"kubernetes.io/projected/188f566f-7d4a-4b9f-b74d-bbee761c0bea-kube-api-access-sp8dl\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.667561 4991 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.667574 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.667587 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/188f566f-7d4a-4b9f-b74d-bbee761c0bea-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.667600 4991 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.667613 4991 reconciler_common.go:293] "Volume detached for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/188f566f-7d4a-4b9f-b74d-bbee761c0bea-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.667628 4991 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/188f566f-7d4a-4b9f-b74d-bbee761c0bea-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.670869 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6703e0-1fac-4734-98ac-88f6163fdaae-kube-api-access-lxshx" (OuterVolumeSpecName: "kube-api-access-lxshx") pod "0a6703e0-1fac-4734-98ac-88f6163fdaae" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae"). InnerVolumeSpecName "kube-api-access-lxshx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.672442 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0a6703e0-1fac-4734-98ac-88f6163fdaae" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.708688 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0a6703e0-1fac-4734-98ac-88f6163fdaae" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.711153 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0a6703e0-1fac-4734-98ac-88f6163fdaae" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.721076 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a6703e0-1fac-4734-98ac-88f6163fdaae" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.754431 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config" (OuterVolumeSpecName: "config") pod "0a6703e0-1fac-4734-98ac-88f6163fdaae" (UID: "0a6703e0-1fac-4734-98ac-88f6163fdaae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.769201 4991 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.769227 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxshx\" (UniqueName: \"kubernetes.io/projected/0a6703e0-1fac-4734-98ac-88f6163fdaae-kube-api-access-lxshx\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.769239 4991 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.769251 4991 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-config\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.769261 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.769270 4991 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.770321 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0a6703e0-1fac-4734-98ac-88f6163fdaae" (UID: 
"0a6703e0-1fac-4734-98ac-88f6163fdaae"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:44 crc kubenswrapper[4991]: I1006 08:42:44.870009 4991 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6703e0-1fac-4734-98ac-88f6163fdaae-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.063440 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.175247 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9448\" (UniqueName: \"kubernetes.io/projected/48f4202b-6558-4fe3-8fcc-732aa1a88e60-kube-api-access-d9448\") pod \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.175631 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-combined-ca-bundle\") pod \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.175744 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-config-data\") pod \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\" (UID: \"48f4202b-6558-4fe3-8fcc-732aa1a88e60\") " Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.191887 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f4202b-6558-4fe3-8fcc-732aa1a88e60-kube-api-access-d9448" (OuterVolumeSpecName: "kube-api-access-d9448") pod "48f4202b-6558-4fe3-8fcc-732aa1a88e60" (UID: 
"48f4202b-6558-4fe3-8fcc-732aa1a88e60"). InnerVolumeSpecName "kube-api-access-d9448". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.195119 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48f4202b-6558-4fe3-8fcc-732aa1a88e60" (UID: "48f4202b-6558-4fe3-8fcc-732aa1a88e60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.199651 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-config-data" (OuterVolumeSpecName: "config-data") pod "48f4202b-6558-4fe3-8fcc-732aa1a88e60" (UID: "48f4202b-6558-4fe3-8fcc-732aa1a88e60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.254564 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" path="/var/lib/kubelet/pods/1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1/volumes" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.255173 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" path="/var/lib/kubelet/pods/51a7066c-5143-43ab-b642-81f461a9c1f4/volumes" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.256372 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" path="/var/lib/kubelet/pods/53c6aca4-4fd0-4d42-bbe2-4b6e91643503/volumes" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.256957 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" 
path="/var/lib/kubelet/pods/79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb/volumes" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.277451 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9448\" (UniqueName: \"kubernetes.io/projected/48f4202b-6558-4fe3-8fcc-732aa1a88e60-kube-api-access-d9448\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.277484 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.277495 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f4202b-6558-4fe3-8fcc-732aa1a88e60-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.453387 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-jklxx_188f566f-7d4a-4b9f-b74d-bbee761c0bea/ovn-controller/0.log" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.453834 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jklxx" event={"ID":"188f566f-7d4a-4b9f-b74d-bbee761c0bea","Type":"ContainerDied","Data":"18957892ea4516db63f3ea0c7c12689f3d3ac981b8c90d15a7a673283632396e"} Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.453876 4991 scope.go:117] "RemoveContainer" containerID="9ce5d29d20258b039798707827257132b489fc0eb7a72cd595e6df3602e635af" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.453890 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jklxx" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.455973 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.456181 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"48f4202b-6558-4fe3-8fcc-732aa1a88e60","Type":"ContainerDied","Data":"70e67b9801c80b2624076c47d8258dd45b7659f413ca1ca9cb329dd5fe8c2fec"} Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.463387 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7988dccf5c-j9ll7" event={"ID":"0a6703e0-1fac-4734-98ac-88f6163fdaae","Type":"ContainerDied","Data":"bafafb918c15f75eb2e131f6d4885779b16767048667dbc16d890cfa68fdaa1f"} Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.463424 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7988dccf5c-j9ll7" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.486152 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jklxx"] Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.499652 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jklxx"] Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.511492 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7988dccf5c-j9ll7"] Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.517068 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7988dccf5c-j9ll7"] Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.522409 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.528054 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.536662 4991 scope.go:117] "RemoveContainer" containerID="4f4325397287518c3ecb285a52c75cc737cf34c7fece8ee912a41c376bf55696" Oct 06 08:42:45 crc 
kubenswrapper[4991]: I1006 08:42:45.552655 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="fded3e15-f946-4f86-bed4-2c4a3262395a" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.193:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.565932 4991 scope.go:117] "RemoveContainer" containerID="93e5b235f20e302b6749df9897200518a9608b53c7db75afd7a755bd7c31a9e2" Oct 06 08:42:45 crc kubenswrapper[4991]: I1006 08:42:45.624829 4991 scope.go:117] "RemoveContainer" containerID="d0aac78aa43c86da1a2d4708b970a7fa2c38a878adf032b4bc160cf815163a9d" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.499060 4991 generic.go:334] "Generic (PLEG): container finished" podID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerID="73a68671b100b61405ad43b9f475805bafe221206d22cbb30d84e206645d9fff" exitCode=0 Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.499355 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9160ed8e-9be5-4d38-b9a0-7138dfecc506","Type":"ContainerDied","Data":"73a68671b100b61405ad43b9f475805bafe221206d22cbb30d84e206645d9fff"} Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.728578 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.798832 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-log-httpd\") pod \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.798994 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qv64\" (UniqueName: \"kubernetes.io/projected/9160ed8e-9be5-4d38-b9a0-7138dfecc506-kube-api-access-5qv64\") pod \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.799061 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-ceilometer-tls-certs\") pod \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.799129 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-sg-core-conf-yaml\") pod \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.799183 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-run-httpd\") pod \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.799279 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-combined-ca-bundle\") pod \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.799430 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-config-data\") pod \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.799520 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-scripts\") pod \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\" (UID: \"9160ed8e-9be5-4d38-b9a0-7138dfecc506\") " Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.800277 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9160ed8e-9be5-4d38-b9a0-7138dfecc506" (UID: "9160ed8e-9be5-4d38-b9a0-7138dfecc506"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.801165 4991 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.801658 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9160ed8e-9be5-4d38-b9a0-7138dfecc506" (UID: "9160ed8e-9be5-4d38-b9a0-7138dfecc506"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.808700 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9160ed8e-9be5-4d38-b9a0-7138dfecc506-kube-api-access-5qv64" (OuterVolumeSpecName: "kube-api-access-5qv64") pod "9160ed8e-9be5-4d38-b9a0-7138dfecc506" (UID: "9160ed8e-9be5-4d38-b9a0-7138dfecc506"). InnerVolumeSpecName "kube-api-access-5qv64". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.825508 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-scripts" (OuterVolumeSpecName: "scripts") pod "9160ed8e-9be5-4d38-b9a0-7138dfecc506" (UID: "9160ed8e-9be5-4d38-b9a0-7138dfecc506"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.842763 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9160ed8e-9be5-4d38-b9a0-7138dfecc506" (UID: "9160ed8e-9be5-4d38-b9a0-7138dfecc506"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.856584 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9160ed8e-9be5-4d38-b9a0-7138dfecc506" (UID: "9160ed8e-9be5-4d38-b9a0-7138dfecc506"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.883619 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9160ed8e-9be5-4d38-b9a0-7138dfecc506" (UID: "9160ed8e-9be5-4d38-b9a0-7138dfecc506"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.902140 4991 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9160ed8e-9be5-4d38-b9a0-7138dfecc506-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.902187 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qv64\" (UniqueName: \"kubernetes.io/projected/9160ed8e-9be5-4d38-b9a0-7138dfecc506-kube-api-access-5qv64\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.902202 4991 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.902214 4991 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.902226 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.902235 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:46 crc kubenswrapper[4991]: I1006 08:42:46.919070 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-config-data" (OuterVolumeSpecName: "config-data") pod "9160ed8e-9be5-4d38-b9a0-7138dfecc506" (UID: "9160ed8e-9be5-4d38-b9a0-7138dfecc506"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.004321 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9160ed8e-9be5-4d38-b9a0-7138dfecc506-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:47 crc kubenswrapper[4991]: E1006 08:42:47.078595 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:47 crc kubenswrapper[4991]: E1006 08:42:47.078923 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:47 crc kubenswrapper[4991]: E1006 08:42:47.079210 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:47 crc kubenswrapper[4991]: E1006 08:42:47.079246 4991 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server" Oct 06 08:42:47 crc kubenswrapper[4991]: E1006 08:42:47.079979 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:47 crc kubenswrapper[4991]: E1006 08:42:47.081249 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:47 crc kubenswrapper[4991]: E1006 08:42:47.082406 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:47 crc kubenswrapper[4991]: E1006 08:42:47.082487 4991 prober.go:104] 
"Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovs-vswitchd" Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.256237 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a6703e0-1fac-4734-98ac-88f6163fdaae" path="/var/lib/kubelet/pods/0a6703e0-1fac-4734-98ac-88f6163fdaae/volumes" Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.256868 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="188f566f-7d4a-4b9f-b74d-bbee761c0bea" path="/var/lib/kubelet/pods/188f566f-7d4a-4b9f-b74d-bbee761c0bea/volumes" Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.257571 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f4202b-6558-4fe3-8fcc-732aa1a88e60" path="/var/lib/kubelet/pods/48f4202b-6558-4fe3-8fcc-732aa1a88e60/volumes" Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.511017 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9160ed8e-9be5-4d38-b9a0-7138dfecc506","Type":"ContainerDied","Data":"9fb4f55ebe69e581e9ff3c3ba36973801afec20da07defb9f5e458675c816932"} Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.511110 4991 scope.go:117] "RemoveContainer" containerID="f562f33940beff3b6675303b795947ca3b1c6407fc9bbc1512ece13bd67badb0" Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.511189 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.533092 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.538058 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.546989 4991 scope.go:117] "RemoveContainer" containerID="0d50a1a51fb76b15129da6a601cd7086c571cfcb88c3e73e801b6de71d603ab5" Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.580891 4991 scope.go:117] "RemoveContainer" containerID="73a68671b100b61405ad43b9f475805bafe221206d22cbb30d84e206645d9fff" Oct 06 08:42:47 crc kubenswrapper[4991]: I1006 08:42:47.619434 4991 scope.go:117] "RemoveContainer" containerID="5649b7a21637b90f14805bd6598872f1dc5add87d3dbf7ebd86b1d9b50e3aaf8" Oct 06 08:42:48 crc kubenswrapper[4991]: I1006 08:42:48.040359 4991 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: i/o timeout" Oct 06 08:42:49 crc kubenswrapper[4991]: I1006 08:42:49.259957 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" path="/var/lib/kubelet/pods/9160ed8e-9be5-4d38-b9a0-7138dfecc506/volumes" Oct 06 08:42:52 crc kubenswrapper[4991]: E1006 08:42:52.078592 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:52 crc kubenswrapper[4991]: E1006 08:42:52.079441 4991 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:52 crc kubenswrapper[4991]: E1006 08:42:52.079752 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:52 crc kubenswrapper[4991]: E1006 08:42:52.079781 4991 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server" Oct 06 08:42:52 crc kubenswrapper[4991]: E1006 08:42:52.080505 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:52 crc kubenswrapper[4991]: E1006 08:42:52.082759 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:52 crc kubenswrapper[4991]: E1006 08:42:52.084436 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:52 crc kubenswrapper[4991]: E1006 08:42:52.084502 4991 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovs-vswitchd" Oct 06 08:42:53 crc kubenswrapper[4991]: I1006 08:42:53.114228 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:53 crc kubenswrapper[4991]: I1006 08:42:53.163042 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bc2q"] Oct 06 08:42:53 crc kubenswrapper[4991]: I1006 08:42:53.572810 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8bc2q" podUID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" containerName="registry-server" containerID="cri-o://11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1" gracePeriod=2 Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.078737 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.104512 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-utilities\") pod \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.104725 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhc2r\" (UniqueName: \"kubernetes.io/projected/544d772e-ee45-4bd6-9895-07dec1dc3ff1-kube-api-access-vhc2r\") pod \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.104768 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-catalog-content\") pod \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\" (UID: \"544d772e-ee45-4bd6-9895-07dec1dc3ff1\") " Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.106329 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-utilities" (OuterVolumeSpecName: "utilities") pod "544d772e-ee45-4bd6-9895-07dec1dc3ff1" (UID: "544d772e-ee45-4bd6-9895-07dec1dc3ff1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.115380 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/544d772e-ee45-4bd6-9895-07dec1dc3ff1-kube-api-access-vhc2r" (OuterVolumeSpecName: "kube-api-access-vhc2r") pod "544d772e-ee45-4bd6-9895-07dec1dc3ff1" (UID: "544d772e-ee45-4bd6-9895-07dec1dc3ff1"). InnerVolumeSpecName "kube-api-access-vhc2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.125565 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "544d772e-ee45-4bd6-9895-07dec1dc3ff1" (UID: "544d772e-ee45-4bd6-9895-07dec1dc3ff1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.207175 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhc2r\" (UniqueName: \"kubernetes.io/projected/544d772e-ee45-4bd6-9895-07dec1dc3ff1-kube-api-access-vhc2r\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.207210 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.207227 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/544d772e-ee45-4bd6-9895-07dec1dc3ff1-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.587630 4991 generic.go:334] "Generic (PLEG): container finished" podID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" containerID="11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1" exitCode=0 Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.587682 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bc2q" event={"ID":"544d772e-ee45-4bd6-9895-07dec1dc3ff1","Type":"ContainerDied","Data":"11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1"} Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.587722 4991 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8bc2q" event={"ID":"544d772e-ee45-4bd6-9895-07dec1dc3ff1","Type":"ContainerDied","Data":"4534c655d15de1ec2556c633c215e41448bf05481ce559ec7dd1bfb799d8efa7"} Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.587750 4991 scope.go:117] "RemoveContainer" containerID="11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.587934 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bc2q" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.626232 4991 scope.go:117] "RemoveContainer" containerID="a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.646477 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bc2q"] Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.662191 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bc2q"] Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.673486 4991 scope.go:117] "RemoveContainer" containerID="38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.782592 4991 scope.go:117] "RemoveContainer" containerID="11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1" Oct 06 08:42:54 crc kubenswrapper[4991]: E1006 08:42:54.783256 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1\": container with ID starting with 11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1 not found: ID does not exist" containerID="11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.783314 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1"} err="failed to get container status \"11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1\": rpc error: code = NotFound desc = could not find container \"11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1\": container with ID starting with 11312cdc876ae8db515af519ad2ea0febec027e77741f7937334b807de43f0a1 not found: ID does not exist" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.783339 4991 scope.go:117] "RemoveContainer" containerID="a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f" Oct 06 08:42:54 crc kubenswrapper[4991]: E1006 08:42:54.783640 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f\": container with ID starting with a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f not found: ID does not exist" containerID="a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.783688 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f"} err="failed to get container status \"a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f\": rpc error: code = NotFound desc = could not find container \"a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f\": container with ID starting with a69224037bb7a4b312d1895a54ffafcd696e1071bc2b1bfdd04a50ad1263701f not found: ID does not exist" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.783705 4991 scope.go:117] "RemoveContainer" containerID="38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda" Oct 06 08:42:54 crc kubenswrapper[4991]: E1006 
08:42:54.784034 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda\": container with ID starting with 38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda not found: ID does not exist" containerID="38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda" Oct 06 08:42:54 crc kubenswrapper[4991]: I1006 08:42:54.784060 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda"} err="failed to get container status \"38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda\": rpc error: code = NotFound desc = could not find container \"38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda\": container with ID starting with 38fc13c77e37c9f1e00a7d2ea790418d81fb0aba7b509e48bc1695d437e50dda not found: ID does not exist" Oct 06 08:42:55 crc kubenswrapper[4991]: I1006 08:42:55.259006 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" path="/var/lib/kubelet/pods/544d772e-ee45-4bd6-9895-07dec1dc3ff1/volumes" Oct 06 08:42:57 crc kubenswrapper[4991]: E1006 08:42:57.077975 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:57 crc kubenswrapper[4991]: E1006 08:42:57.078885 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:57 crc kubenswrapper[4991]: E1006 08:42:57.079346 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:42:57 crc kubenswrapper[4991]: E1006 08:42:57.079418 4991 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server" Oct 06 08:42:57 crc kubenswrapper[4991]: E1006 08:42:57.079633 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:57 crc kubenswrapper[4991]: E1006 08:42:57.082055 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 
08:42:57 crc kubenswrapper[4991]: E1006 08:42:57.084785 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:42:57 crc kubenswrapper[4991]: E1006 08:42:57.085153 4991 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovs-vswitchd" Oct 06 08:42:57 crc kubenswrapper[4991]: I1006 08:42:57.529883 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:42:57 crc kubenswrapper[4991]: I1006 08:42:57.529995 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:43:02 crc kubenswrapper[4991]: E1006 08:43:02.078699 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 
06 08:43:02 crc kubenswrapper[4991]: E1006 08:43:02.079854 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:43:02 crc kubenswrapper[4991]: E1006 08:43:02.080394 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:43:02 crc kubenswrapper[4991]: E1006 08:43:02.080608 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 06 08:43:02 crc kubenswrapper[4991]: E1006 08:43:02.080726 4991 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server" Oct 06 08:43:02 crc kubenswrapper[4991]: E1006 08:43:02.082751 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:43:02 crc kubenswrapper[4991]: E1006 08:43:02.085772 4991 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 06 08:43:02 crc kubenswrapper[4991]: E1006 08:43:02.085834 4991 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-5prwt" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovs-vswitchd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.178847 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-592vz"] Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.179769 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerName="neutron-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.179796 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerName="neutron-api" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.179821 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.179834 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 
08:43:04.179849 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.179862 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.179878 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="proxy-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.179890 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="proxy-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.179905 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="sg-core" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.179917 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="sg-core" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.179935 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a24973-6ef6-4732-9a96-040ce646a707" containerName="glance-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.179949 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a24973-6ef6-4732-9a96-040ce646a707" containerName="glance-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.179965 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7f3760-250c-4e34-8bde-7e9218b711ff" containerName="barbican-keystone-listener-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.179977 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7f3760-250c-4e34-8bde-7e9218b711ff" containerName="barbican-keystone-listener-log" Oct 06 08:43:04 crc 
kubenswrapper[4991]: E1006 08:43:04.180003 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="ceilometer-notification-agent" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180015 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="ceilometer-notification-agent" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180031 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" containerName="barbican-worker" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180044 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" containerName="barbican-worker" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180067 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305f56cb-d896-435c-ae06-4a407714b503" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180080 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="305f56cb-d896-435c-ae06-4a407714b503" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180097 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188f566f-7d4a-4b9f-b74d-bbee761c0bea" containerName="ovn-controller" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180110 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="188f566f-7d4a-4b9f-b74d-bbee761c0bea" containerName="ovn-controller" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180125 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="ceilometer-central-agent" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180137 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" 
containerName="ceilometer-central-agent" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180158 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f986ad-8a8d-44d3-b200-479a60f8b8b3" containerName="mysql-bootstrap" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180169 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f986ad-8a8d-44d3-b200-479a60f8b8b3" containerName="mysql-bootstrap" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180190 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" containerName="rabbitmq" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180202 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" containerName="rabbitmq" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180217 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerName="barbican-api-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180514 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerName="barbican-api-log" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180546 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180560 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-log" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180578 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" containerName="extract-content" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180591 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" 
containerName="extract-content" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180608 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b53689-326b-4f4c-a625-beec7a3631fa" containerName="nova-cell1-conductor-conductor" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180620 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b53689-326b-4f4c-a625-beec7a3631fa" containerName="nova-cell1-conductor-conductor" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180640 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" containerName="setup-container" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180654 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" containerName="setup-container" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180677 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f4202b-6558-4fe3-8fcc-732aa1a88e60" containerName="nova-scheduler-scheduler" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180690 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f4202b-6558-4fe3-8fcc-732aa1a88e60" containerName="nova-scheduler-scheduler" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180727 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" containerName="rabbitmq" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180739 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" containerName="rabbitmq" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180759 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50622552-6b5c-4af5-a457-09c526c54f3f" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180772 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="50622552-6b5c-4af5-a457-09c526c54f3f" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180792 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" containerName="cinder-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180804 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" containerName="cinder-api" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180828 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerName="barbican-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180840 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerName="barbican-api" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180864 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2791937-a79f-4d99-b895-6d3ac79ba220" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180876 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2791937-a79f-4d99-b895-6d3ac79ba220" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180898 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b135498-feb3-4024-b655-92f403f55bb9" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180911 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b135498-feb3-4024-b655-92f403f55bb9" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180933 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" containerName="proxy-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180945 4991 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" containerName="proxy-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180967 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-metadata" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.180979 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-metadata" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.180996 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" containerName="keystone-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181008 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" containerName="keystone-api" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181024 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="697548ef-9b89-4827-a5f1-4e535ae94722" containerName="nova-cell0-conductor-conductor" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181037 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="697548ef-9b89-4827-a5f1-4e535ae94722" containerName="nova-cell0-conductor-conductor" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181051 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" containerName="ovsdbserver-nb" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181063 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" containerName="ovsdbserver-nb" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181085 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7f3760-250c-4e34-8bde-7e9218b711ff" containerName="barbican-keystone-listener" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181096 4991 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7f3760-250c-4e34-8bde-7e9218b711ff" containerName="barbican-keystone-listener" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181113 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" containerName="extract-utilities" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181125 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" containerName="extract-utilities" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181138 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerName="neutron-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181150 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerName="neutron-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181166 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" containerName="proxy-server" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181178 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" containerName="proxy-server" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181196 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e696d7-7767-4a92-9828-a189ffb52275" containerName="nova-api-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181400 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e696d7-7767-4a92-9828-a189ffb52275" containerName="nova-api-api" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181427 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e696d7-7767-4a92-9828-a189ffb52275" containerName="nova-api-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181438 4991 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="23e696d7-7767-4a92-9828-a189ffb52275" containerName="nova-api-log" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181458 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb6a9a7-403e-4dc9-903c-349391d84efb" containerName="placement-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181469 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb6a9a7-403e-4dc9-903c-349391d84efb" containerName="placement-log" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181487 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" containerName="registry-server" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181499 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" containerName="registry-server" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181519 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" containerName="setup-container" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181532 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" containerName="setup-container" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181551 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" containerName="barbican-worker-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181563 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" containerName="barbican-worker-log" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181582 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" containerName="cinder-api-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181593 4991 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" containerName="cinder-api-log" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181608 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d06311c-e246-4d3d-ba9c-388cb800ac4f" containerName="init" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181621 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d06311c-e246-4d3d-ba9c-388cb800ac4f" containerName="init" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181638 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f986ad-8a8d-44d3-b200-479a60f8b8b3" containerName="galera" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181649 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f986ad-8a8d-44d3-b200-479a60f8b8b3" containerName="galera" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181669 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerName="ovn-northd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181681 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerName="ovn-northd" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181702 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1297ce-72cf-4b07-a66d-826e8e9c1663" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181716 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1297ce-72cf-4b07-a66d-826e8e9c1663" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181741 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a24973-6ef6-4732-9a96-040ce646a707" containerName="glance-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181758 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d1a24973-6ef6-4732-9a96-040ce646a707" containerName="glance-log" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181786 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa57b1fb-c743-4137-9501-a0110f385b1c" containerName="glance-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181802 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa57b1fb-c743-4137-9501-a0110f385b1c" containerName="glance-log" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181823 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa57b1fb-c743-4137-9501-a0110f385b1c" containerName="glance-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181839 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa57b1fb-c743-4137-9501-a0110f385b1c" containerName="glance-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181867 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033164fc-5a6f-4b9d-8c3a-1e4242078c9e" containerName="memcached" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181879 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="033164fc-5a6f-4b9d-8c3a-1e4242078c9e" containerName="memcached" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181894 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb6a9a7-403e-4dc9-903c-349391d84efb" containerName="placement-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181906 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb6a9a7-403e-4dc9-903c-349391d84efb" containerName="placement-api" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181920 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157f3f65-3397-4a2d-98ea-1ae5897c7a76" containerName="galera" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181932 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="157f3f65-3397-4a2d-98ea-1ae5897c7a76" 
containerName="galera" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.181955 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.181997 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.182018 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fded3e15-f946-4f86-bed4-2c4a3262395a" containerName="kube-state-metrics" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182031 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="fded3e15-f946-4f86-bed4-2c4a3262395a" containerName="kube-state-metrics" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.182054 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d06311c-e246-4d3d-ba9c-388cb800ac4f" containerName="dnsmasq-dns" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182069 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d06311c-e246-4d3d-ba9c-388cb800ac4f" containerName="dnsmasq-dns" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.182090 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157f3f65-3397-4a2d-98ea-1ae5897c7a76" containerName="mysql-bootstrap" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182104 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="157f3f65-3397-4a2d-98ea-1ae5897c7a76" containerName="mysql-bootstrap" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.182124 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87faa73b-1148-48ae-88f4-3bdd06898658" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182137 4991 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="87faa73b-1148-48ae-88f4-3bdd06898658" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.182156 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4175b5d-7866-481a-a923-1ae5f3307195" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182169 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4175b5d-7866-481a-a923-1ae5f3307195" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 08:43:04 crc kubenswrapper[4991]: E1006 08:43:04.182192 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b135498-feb3-4024-b655-92f403f55bb9" containerName="ovsdbserver-sb" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182203 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b135498-feb3-4024-b655-92f403f55bb9" containerName="ovsdbserver-sb" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182712 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b135498-feb3-4024-b655-92f403f55bb9" containerName="ovsdbserver-sb" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182733 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2791937-a79f-4d99-b895-6d3ac79ba220" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182753 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" containerName="cinder-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182771 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="50622552-6b5c-4af5-a457-09c526c54f3f" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182793 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerName="ovn-northd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182815 4991 
memory_manager.go:354] "RemoveStaleState removing state" podUID="aa57b1fb-c743-4137-9501-a0110f385b1c" containerName="glance-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182838 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f4202b-6558-4fe3-8fcc-732aa1a88e60" containerName="nova-scheduler-scheduler" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182865 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerName="barbican-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182888 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad30dfa-4735-4ef3-8fcc-4b6f25eefcd6" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182906 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="305f56cb-d896-435c-ae06-4a407714b503" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182925 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="033164fc-5a6f-4b9d-8c3a-1e4242078c9e" containerName="memcached" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182953 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e696d7-7767-4a92-9828-a189ffb52275" containerName="nova-api-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182977 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="544d772e-ee45-4bd6-9895-07dec1dc3ff1" containerName="registry-server" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.182999 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b135498-feb3-4024-b655-92f403f55bb9" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183018 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerName="neutron-api" Oct 06 08:43:04 
crc kubenswrapper[4991]: I1006 08:43:04.183037 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" containerName="barbican-worker" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183051 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a24973-6ef6-4732-9a96-040ce646a707" containerName="glance-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183067 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="697548ef-9b89-4827-a5f1-4e535ae94722" containerName="nova-cell0-conductor-conductor" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183086 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="proxy-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183102 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="fded3e15-f946-4f86-bed4-2c4a3262395a" containerName="kube-state-metrics" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183113 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" containerName="ovsdbserver-nb" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183130 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8ba650-c3ef-45bd-ac9b-daaa4889c2f1" containerName="rabbitmq" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183147 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" containerName="proxy-server" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183167 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="87faa73b-1148-48ae-88f4-3bdd06898658" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183187 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="815c282e-cc40-4ff8-b3f8-155d9a91a20b" 
containerName="cinder-api-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183204 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d06311c-e246-4d3d-ba9c-388cb800ac4f" containerName="dnsmasq-dns" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183224 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6703e0-1fac-4734-98ac-88f6163fdaae" containerName="neutron-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183240 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e696d7-7767-4a92-9828-a189ffb52275" containerName="nova-api-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183255 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="ceilometer-central-agent" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183272 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1297ce-72cf-4b07-a66d-826e8e9c1663" containerName="mariadb-account-delete" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183286 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="188f566f-7d4a-4b9f-b74d-bbee761c0bea" containerName="ovn-controller" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183327 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9be32ba-d183-4fd5-ba8b-63f79c973c81" containerName="barbican-api-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183343 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa57b1fb-c743-4137-9501-a0110f385b1c" containerName="glance-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183361 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183380 4991 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="51a7066c-5143-43ab-b642-81f461a9c1f4" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183395 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="ceilometer-notification-agent" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183411 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7f3760-250c-4e34-8bde-7e9218b711ff" containerName="barbican-keystone-listener" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183424 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a24973-6ef6-4732-9a96-040ce646a707" containerName="glance-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183444 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb6a9a7-403e-4dc9-903c-349391d84efb" containerName="placement-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183458 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7f3760-250c-4e34-8bde-7e9218b711ff" containerName="barbican-keystone-listener-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183473 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad6d483-bca3-4391-9e4c-290b6b15b1f4" containerName="openstack-network-exporter" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183495 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f986ad-8a8d-44d3-b200-479a60f8b8b3" containerName="galera" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183513 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e2b1c5-03aa-4472-9002-7daf936edc67" containerName="nova-metadata-metadata" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183530 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="157f3f65-3397-4a2d-98ea-1ae5897c7a76" containerName="galera" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 
08:43:04.183549 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b53689-326b-4f4c-a625-beec7a3631fa" containerName="nova-cell1-conductor-conductor" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183568 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="801bcc07-7874-4eb8-8447-40178d80ea09" containerName="proxy-httpd" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183582 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="9160ed8e-9be5-4d38-b9a0-7138dfecc506" containerName="sg-core" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183599 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2720ee8-eb06-4a0b-9bee-153b69ee769e" containerName="barbican-worker-log" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183616 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e5c10e-c9bd-4a93-a060-4bd49e8cb8eb" containerName="keystone-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183632 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb6a9a7-403e-4dc9-903c-349391d84efb" containerName="placement-api" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183647 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4175b5d-7866-481a-a923-1ae5f3307195" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.183662 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c6aca4-4fd0-4d42-bbe2-4b6e91643503" containerName="rabbitmq" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.185657 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.202027 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-592vz"] Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.270575 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-catalog-content\") pod \"redhat-operators-592vz\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") " pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.271009 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-utilities\") pod \"redhat-operators-592vz\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") " pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.271134 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn645\" (UniqueName: \"kubernetes.io/projected/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-kube-api-access-wn645\") pod \"redhat-operators-592vz\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") " pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.372652 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-utilities\") pod \"redhat-operators-592vz\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") " pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.372707 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wn645\" (UniqueName: \"kubernetes.io/projected/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-kube-api-access-wn645\") pod \"redhat-operators-592vz\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") " pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.372730 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-catalog-content\") pod \"redhat-operators-592vz\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") " pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.373242 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-utilities\") pod \"redhat-operators-592vz\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") " pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.373279 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-catalog-content\") pod \"redhat-operators-592vz\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") " pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.401198 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn645\" (UniqueName: \"kubernetes.io/projected/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-kube-api-access-wn645\") pod \"redhat-operators-592vz\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") " pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: I1006 08:43:04.547024 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:04 crc kubenswrapper[4991]: W1006 08:43:04.997987 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5eb2f7c_6c04_4736_a677_4fd1a7571c9c.slice/crio-3af16025e9df4c3a3099430d9140237743b7e1a097353d8c3c878c11efdc1bf0 WatchSource:0}: Error finding container 3af16025e9df4c3a3099430d9140237743b7e1a097353d8c3c878c11efdc1bf0: Status 404 returned error can't find the container with id 3af16025e9df4c3a3099430d9140237743b7e1a097353d8c3c878c11efdc1bf0 Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.000598 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-592vz"] Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.741998 4991 generic.go:334] "Generic (PLEG): container finished" podID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" containerID="0d1610527cf8b6f50326a4d6ebe66a1e52c2dc1024a98e810b007cc8199eb0b7" exitCode=137 Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.742056 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4dd2d34c-a29e-47b8-98b4-f75fffb11673","Type":"ContainerDied","Data":"0d1610527cf8b6f50326a4d6ebe66a1e52c2dc1024a98e810b007cc8199eb0b7"} Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.753782 4991 generic.go:334] "Generic (PLEG): container finished" podID="14cb118a-286e-4ded-890d-fc788f9361f4" containerID="eb28a1e65b323917d5e53d7d3619b4b0894ce6380fa661067a656f0faf1a3966" exitCode=137 Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.753841 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"eb28a1e65b323917d5e53d7d3619b4b0894ce6380fa661067a656f0faf1a3966"} Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.757849 4991 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-controller-ovs-5prwt_63c7d8f9-5c85-4999-b60b-517b03ff5992/ovs-vswitchd/0.log" Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.758492 4991 generic.go:334] "Generic (PLEG): container finished" podID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" exitCode=137 Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.758551 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5prwt" event={"ID":"63c7d8f9-5c85-4999-b60b-517b03ff5992","Type":"ContainerDied","Data":"6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159"} Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.761165 4991 generic.go:334] "Generic (PLEG): container finished" podID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerID="5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f" exitCode=0 Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.761211 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-592vz" event={"ID":"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c","Type":"ContainerDied","Data":"5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f"} Oct 06 08:43:05 crc kubenswrapper[4991]: I1006 08:43:05.761238 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-592vz" event={"ID":"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c","Type":"ContainerStarted","Data":"3af16025e9df4c3a3099430d9140237743b7e1a097353d8c3c878c11efdc1bf0"} Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.063787 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5prwt_63c7d8f9-5c85-4999-b60b-517b03ff5992/ovs-vswitchd/0.log" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.064829 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.070405 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.106448 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207071 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift\") pod \"14cb118a-286e-4ded-890d-fc788f9361f4\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207370 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63c7d8f9-5c85-4999-b60b-517b03ff5992-scripts\") pod \"63c7d8f9-5c85-4999-b60b-517b03ff5992\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207519 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-log\") pod \"63c7d8f9-5c85-4999-b60b-517b03ff5992\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207576 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-log" (OuterVolumeSpecName: "var-log") pod "63c7d8f9-5c85-4999-b60b-517b03ff5992" (UID: "63c7d8f9-5c85-4999-b60b-517b03ff5992"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207616 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-etc-ovs\") pod \"63c7d8f9-5c85-4999-b60b-517b03ff5992\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207694 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dd2d34c-a29e-47b8-98b4-f75fffb11673-etc-machine-id\") pod \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207736 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpf54\" (UniqueName: \"kubernetes.io/projected/63c7d8f9-5c85-4999-b60b-517b03ff5992-kube-api-access-rpf54\") pod \"63c7d8f9-5c85-4999-b60b-517b03ff5992\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207766 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-lib\") pod \"63c7d8f9-5c85-4999-b60b-517b03ff5992\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207792 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-combined-ca-bundle\") pod \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207817 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-scripts\") pod \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207841 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-cache\") pod \"14cb118a-286e-4ded-890d-fc788f9361f4\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207888 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"14cb118a-286e-4ded-890d-fc788f9361f4\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207937 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-run\") pod \"63c7d8f9-5c85-4999-b60b-517b03ff5992\" (UID: \"63c7d8f9-5c85-4999-b60b-517b03ff5992\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.207968 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22hq5\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-kube-api-access-22hq5\") pod \"14cb118a-286e-4ded-890d-fc788f9361f4\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208013 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data\") pod \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208072 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-lock\") pod \"14cb118a-286e-4ded-890d-fc788f9361f4\" (UID: \"14cb118a-286e-4ded-890d-fc788f9361f4\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208103 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dbrj\" (UniqueName: \"kubernetes.io/projected/4dd2d34c-a29e-47b8-98b4-f75fffb11673-kube-api-access-7dbrj\") pod \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208146 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data-custom\") pod \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\" (UID: \"4dd2d34c-a29e-47b8-98b4-f75fffb11673\") " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208220 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4dd2d34c-a29e-47b8-98b4-f75fffb11673-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4dd2d34c-a29e-47b8-98b4-f75fffb11673" (UID: "4dd2d34c-a29e-47b8-98b4-f75fffb11673"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208326 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63c7d8f9-5c85-4999-b60b-517b03ff5992-scripts" (OuterVolumeSpecName: "scripts") pod "63c7d8f9-5c85-4999-b60b-517b03ff5992" (UID: "63c7d8f9-5c85-4999-b60b-517b03ff5992"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208288 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-lib" (OuterVolumeSpecName: "var-lib") pod "63c7d8f9-5c85-4999-b60b-517b03ff5992" (UID: "63c7d8f9-5c85-4999-b60b-517b03ff5992"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208408 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "63c7d8f9-5c85-4999-b60b-517b03ff5992" (UID: "63c7d8f9-5c85-4999-b60b-517b03ff5992"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208441 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-run" (OuterVolumeSpecName: "var-run") pod "63c7d8f9-5c85-4999-b60b-517b03ff5992" (UID: "63c7d8f9-5c85-4999-b60b-517b03ff5992"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208716 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63c7d8f9-5c85-4999-b60b-517b03ff5992-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208742 4991 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-log\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208754 4991 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208765 4991 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dd2d34c-a29e-47b8-98b4-f75fffb11673-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208779 4991 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-lib\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208789 4991 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63c7d8f9-5c85-4999-b60b-517b03ff5992-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208787 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-lock" (OuterVolumeSpecName: "lock") pod "14cb118a-286e-4ded-890d-fc788f9361f4" (UID: "14cb118a-286e-4ded-890d-fc788f9361f4"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.208910 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-cache" (OuterVolumeSpecName: "cache") pod "14cb118a-286e-4ded-890d-fc788f9361f4" (UID: "14cb118a-286e-4ded-890d-fc788f9361f4"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.214146 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "14cb118a-286e-4ded-890d-fc788f9361f4" (UID: "14cb118a-286e-4ded-890d-fc788f9361f4"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.214231 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-kube-api-access-22hq5" (OuterVolumeSpecName: "kube-api-access-22hq5") pod "14cb118a-286e-4ded-890d-fc788f9361f4" (UID: "14cb118a-286e-4ded-890d-fc788f9361f4"). InnerVolumeSpecName "kube-api-access-22hq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.214393 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4dd2d34c-a29e-47b8-98b4-f75fffb11673" (UID: "4dd2d34c-a29e-47b8-98b4-f75fffb11673"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.214430 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-scripts" (OuterVolumeSpecName: "scripts") pod "4dd2d34c-a29e-47b8-98b4-f75fffb11673" (UID: "4dd2d34c-a29e-47b8-98b4-f75fffb11673"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.214732 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c7d8f9-5c85-4999-b60b-517b03ff5992-kube-api-access-rpf54" (OuterVolumeSpecName: "kube-api-access-rpf54") pod "63c7d8f9-5c85-4999-b60b-517b03ff5992" (UID: "63c7d8f9-5c85-4999-b60b-517b03ff5992"). InnerVolumeSpecName "kube-api-access-rpf54". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.215225 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "14cb118a-286e-4ded-890d-fc788f9361f4" (UID: "14cb118a-286e-4ded-890d-fc788f9361f4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.216227 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd2d34c-a29e-47b8-98b4-f75fffb11673-kube-api-access-7dbrj" (OuterVolumeSpecName: "kube-api-access-7dbrj") pod "4dd2d34c-a29e-47b8-98b4-f75fffb11673" (UID: "4dd2d34c-a29e-47b8-98b4-f75fffb11673"). InnerVolumeSpecName "kube-api-access-7dbrj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.259936 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dd2d34c-a29e-47b8-98b4-f75fffb11673" (UID: "4dd2d34c-a29e-47b8-98b4-f75fffb11673"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.310031 4991 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.310067 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpf54\" (UniqueName: \"kubernetes.io/projected/63c7d8f9-5c85-4999-b60b-517b03ff5992-kube-api-access-rpf54\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.310078 4991 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.310087 4991 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.310096 4991 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-cache\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.310129 4991 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.310138 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22hq5\" (UniqueName: \"kubernetes.io/projected/14cb118a-286e-4ded-890d-fc788f9361f4-kube-api-access-22hq5\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.310148 4991 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/14cb118a-286e-4ded-890d-fc788f9361f4-lock\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.310161 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dbrj\" (UniqueName: \"kubernetes.io/projected/4dd2d34c-a29e-47b8-98b4-f75fffb11673-kube-api-access-7dbrj\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.310172 4991 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.313169 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data" (OuterVolumeSpecName: "config-data") pod "4dd2d34c-a29e-47b8-98b4-f75fffb11673" (UID: "4dd2d34c-a29e-47b8-98b4-f75fffb11673"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.325540 4991 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.411806 4991 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.411869 4991 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd2d34c-a29e-47b8-98b4-f75fffb11673-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.776507 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4dd2d34c-a29e-47b8-98b4-f75fffb11673","Type":"ContainerDied","Data":"01ff58e29358c6ee8636d1f23f1014260c33e2cf51712714c57202fc6db62ffa"} Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.776660 4991 scope.go:117] "RemoveContainer" containerID="d649d548626a4bd3bff872429af0bef8f3a02f2808a38286a2013d34229a5407" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.776850 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.795905 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"14cb118a-286e-4ded-890d-fc788f9361f4","Type":"ContainerDied","Data":"0009a524b9b82e8e3d21213b28a78520227e9ba17988a0f3fbb02000a6be9944"} Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.796155 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.799836 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5prwt_63c7d8f9-5c85-4999-b60b-517b03ff5992/ovs-vswitchd/0.log" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.801587 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5prwt" event={"ID":"63c7d8f9-5c85-4999-b60b-517b03ff5992","Type":"ContainerDied","Data":"3faa43e3ebe5b0a934ddcdf3553d1b39c5fd37531efe75285afb4fc1de61c554"} Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.801826 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5prwt" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.817674 4991 scope.go:117] "RemoveContainer" containerID="0d1610527cf8b6f50326a4d6ebe66a1e52c2dc1024a98e810b007cc8199eb0b7" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.831003 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.854074 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.872065 4991 scope.go:117] "RemoveContainer" containerID="eb28a1e65b323917d5e53d7d3619b4b0894ce6380fa661067a656f0faf1a3966" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.872767 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.879666 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.883804 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-5prwt"] Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.887534 4991 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/ovn-controller-ovs-5prwt"] Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.893027 4991 scope.go:117] "RemoveContainer" containerID="25950ee93c182d2a8f2b482674bcf125f0ce2007882775e9431029f2d5153184" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.915522 4991 scope.go:117] "RemoveContainer" containerID="662006c1a00d0cac716c8677f83ad79a7b88245c89d3c05d4a41987440c0babd" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.934709 4991 scope.go:117] "RemoveContainer" containerID="264ceea5be73f445fe8809bba7e4a58faeb85d87ce005ae2e2337c4fbd772807" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.951501 4991 scope.go:117] "RemoveContainer" containerID="8619a7be0d8b8d3e157358434fab68c5d39a5c107bae0e507da39b55321787f9" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.971105 4991 scope.go:117] "RemoveContainer" containerID="fac75ff26b47c3f0e62bea6d62aa82cb9e5265892c9bea171fe5b4d799545d4b" Oct 06 08:43:06 crc kubenswrapper[4991]: I1006 08:43:06.989764 4991 scope.go:117] "RemoveContainer" containerID="d9388ecf0c6db1afc9baa8762ef9460101639492f4059916a5452baf6ce1da9b" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.004203 4991 scope.go:117] "RemoveContainer" containerID="aebf96364238cb6b3d252db6049f87fc6c27dc0650a174ecda7b2742358b2979" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.019516 4991 scope.go:117] "RemoveContainer" containerID="cc510399cff86b9534906da4fd4dfb566ffc21c65dc3e7a29de4d1e16e9e7f7a" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.038032 4991 scope.go:117] "RemoveContainer" containerID="abaa2e04344e35bc84fdbd617310659cf3403a7924fe6ea867f216abcc6fa8c7" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.055135 4991 scope.go:117] "RemoveContainer" containerID="18a56e04769a024151f561f4820a607601164263d72a0ba3ba3c5a8eb7b72631" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.072799 4991 scope.go:117] "RemoveContainer" 
containerID="3b537ff709c1788e201f7be5c9872d032b3f628ae4187cee84bd9ddc9645c96c" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.089195 4991 scope.go:117] "RemoveContainer" containerID="cacc49468ee93ceabe894ccc8d50085a9655611b6c4501bf305bb67771d140e5" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.114635 4991 scope.go:117] "RemoveContainer" containerID="c9ef1fa176e4762e4800cf8c17d38583018327434b1f427f17c6368143ce1443" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.138900 4991 scope.go:117] "RemoveContainer" containerID="ed12c4a932f30894215eff330feb00b02897cadb829ca357ed1fd45e5afdf1b3" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.161787 4991 scope.go:117] "RemoveContainer" containerID="6995da8efae859a8428f75fedf8baa18bc43feab91f99aa1acb1c2111c76f159" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.189103 4991 scope.go:117] "RemoveContainer" containerID="2e7eb2582370554773ae98aed6757b4864dc6792c09d3a3d1a34f351287002b7" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.215513 4991 scope.go:117] "RemoveContainer" containerID="7361661faa0dd965eb9150f74e1354d3da89ded19ead38c6742a02c9d3302dbc" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.257399 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" path="/var/lib/kubelet/pods/14cb118a-286e-4ded-890d-fc788f9361f4/volumes" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.262810 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" path="/var/lib/kubelet/pods/4dd2d34c-a29e-47b8-98b4-f75fffb11673/volumes" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.264836 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" path="/var/lib/kubelet/pods/63c7d8f9-5c85-4999-b60b-517b03ff5992/volumes" Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.824441 4991 generic.go:334] "Generic 
(PLEG): container finished" podID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerID="e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0" exitCode=0 Oct 06 08:43:07 crc kubenswrapper[4991]: I1006 08:43:07.824529 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-592vz" event={"ID":"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c","Type":"ContainerDied","Data":"e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0"} Oct 06 08:43:08 crc kubenswrapper[4991]: I1006 08:43:08.715221 4991 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode8e91b06-a3c1-41dc-b2f8-af738647ade8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode8e91b06-a3c1-41dc-b2f8-af738647ade8] : Timed out while waiting for systemd to remove kubepods-besteffort-pode8e91b06_a3c1_41dc_b2f8_af738647ade8.slice" Oct 06 08:43:08 crc kubenswrapper[4991]: E1006 08:43:08.715670 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pode8e91b06-a3c1-41dc-b2f8-af738647ade8] : unable to destroy cgroup paths for cgroup [kubepods besteffort pode8e91b06-a3c1-41dc-b2f8-af738647ade8] : Timed out while waiting for systemd to remove kubepods-besteffort-pode8e91b06_a3c1_41dc_b2f8_af738647ade8.slice" pod="openstack/openstackclient" podUID="e8e91b06-a3c1-41dc-b2f8-af738647ade8" Oct 06 08:43:08 crc kubenswrapper[4991]: I1006 08:43:08.841130 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 08:43:08 crc kubenswrapper[4991]: I1006 08:43:08.841125 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-592vz" event={"ID":"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c","Type":"ContainerStarted","Data":"35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8"} Oct 06 08:43:08 crc kubenswrapper[4991]: I1006 08:43:08.867950 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-592vz" podStartSLOduration=2.218992017 podStartE2EDuration="4.86792915s" podCreationTimestamp="2025-10-06 08:43:04 +0000 UTC" firstStartedPulling="2025-10-06 08:43:05.762707314 +0000 UTC m=+1437.500457335" lastFinishedPulling="2025-10-06 08:43:08.411644407 +0000 UTC m=+1440.149394468" observedRunningTime="2025-10-06 08:43:08.863887063 +0000 UTC m=+1440.601637114" watchObservedRunningTime="2025-10-06 08:43:08.86792915 +0000 UTC m=+1440.605679181" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.689228 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi9279-account-delete-bsk7x" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.743652 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinderb589-account-delete-h8q45" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.783768 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmvb7\" (UniqueName: \"kubernetes.io/projected/2b01de4c-42f4-4928-916a-6a9638340718-kube-api-access-fmvb7\") pod \"2b01de4c-42f4-4928-916a-6a9638340718\" (UID: \"2b01de4c-42f4-4928-916a-6a9638340718\") " Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.788768 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b01de4c-42f4-4928-916a-6a9638340718-kube-api-access-fmvb7" (OuterVolumeSpecName: "kube-api-access-fmvb7") pod "2b01de4c-42f4-4928-916a-6a9638340718" (UID: "2b01de4c-42f4-4928-916a-6a9638340718"). InnerVolumeSpecName "kube-api-access-fmvb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.860746 4991 generic.go:334] "Generic (PLEG): container finished" podID="7d3b515a-b48d-48f7-8775-a0299e07f231" containerID="156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b" exitCode=137 Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.860845 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderb589-account-delete-h8q45" event={"ID":"7d3b515a-b48d-48f7-8775-a0299e07f231","Type":"ContainerDied","Data":"156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b"} Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.860881 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinderb589-account-delete-h8q45" event={"ID":"7d3b515a-b48d-48f7-8775-a0299e07f231","Type":"ContainerDied","Data":"4c3a5e54b8632d36c596bec64e3dbf3296c1597a7040c4b8a896e908b5fae82a"} Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.860905 4991 scope.go:117] "RemoveContainer" containerID="156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b" Oct 06 08:43:10 crc 
kubenswrapper[4991]: I1006 08:43:10.861050 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinderb589-account-delete-h8q45" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.863783 4991 generic.go:334] "Generic (PLEG): container finished" podID="2b01de4c-42f4-4928-916a-6a9638340718" containerID="5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085" exitCode=137 Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.863828 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi9279-account-delete-bsk7x" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.863840 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi9279-account-delete-bsk7x" event={"ID":"2b01de4c-42f4-4928-916a-6a9638340718","Type":"ContainerDied","Data":"5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085"} Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.863880 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi9279-account-delete-bsk7x" event={"ID":"2b01de4c-42f4-4928-916a-6a9638340718","Type":"ContainerDied","Data":"8b9fb576189ea0b5bcf5af20288a8d25ce827ec05eb093f6aed9ec5455262ba0"} Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.884874 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmzzc\" (UniqueName: \"kubernetes.io/projected/7d3b515a-b48d-48f7-8775-a0299e07f231-kube-api-access-hmzzc\") pod \"7d3b515a-b48d-48f7-8775-a0299e07f231\" (UID: \"7d3b515a-b48d-48f7-8775-a0299e07f231\") " Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.885212 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmvb7\" (UniqueName: \"kubernetes.io/projected/2b01de4c-42f4-4928-916a-6a9638340718-kube-api-access-fmvb7\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.888924 4991 scope.go:117] 
"RemoveContainer" containerID="156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b" Oct 06 08:43:10 crc kubenswrapper[4991]: E1006 08:43:10.889376 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b\": container with ID starting with 156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b not found: ID does not exist" containerID="156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.889420 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b"} err="failed to get container status \"156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b\": rpc error: code = NotFound desc = could not find container \"156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b\": container with ID starting with 156f9d93062f9e13e861e8da1ecf7added5e0c76123850ef962d2c1700929e4b not found: ID does not exist" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.889454 4991 scope.go:117] "RemoveContainer" containerID="5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.891401 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3b515a-b48d-48f7-8775-a0299e07f231-kube-api-access-hmzzc" (OuterVolumeSpecName: "kube-api-access-hmzzc") pod "7d3b515a-b48d-48f7-8775-a0299e07f231" (UID: "7d3b515a-b48d-48f7-8775-a0299e07f231"). InnerVolumeSpecName "kube-api-access-hmzzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.913714 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi9279-account-delete-bsk7x"] Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.916705 4991 scope.go:117] "RemoveContainer" containerID="5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085" Oct 06 08:43:10 crc kubenswrapper[4991]: E1006 08:43:10.917099 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085\": container with ID starting with 5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085 not found: ID does not exist" containerID="5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.917135 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085"} err="failed to get container status \"5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085\": rpc error: code = NotFound desc = could not find container \"5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085\": container with ID starting with 5464e898bd8359a929111778930467e03b419e3395e8322b0f20a23062311085 not found: ID does not exist" Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.922854 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi9279-account-delete-bsk7x"] Oct 06 08:43:10 crc kubenswrapper[4991]: I1006 08:43:10.986994 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmzzc\" (UniqueName: \"kubernetes.io/projected/7d3b515a-b48d-48f7-8775-a0299e07f231-kube-api-access-hmzzc\") on node \"crc\" DevicePath \"\"" Oct 06 08:43:11 crc kubenswrapper[4991]: I1006 08:43:11.212482 4991 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinderb589-account-delete-h8q45"] Oct 06 08:43:11 crc kubenswrapper[4991]: I1006 08:43:11.219135 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinderb589-account-delete-h8q45"] Oct 06 08:43:11 crc kubenswrapper[4991]: I1006 08:43:11.259947 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b01de4c-42f4-4928-916a-6a9638340718" path="/var/lib/kubelet/pods/2b01de4c-42f4-4928-916a-6a9638340718/volumes" Oct 06 08:43:11 crc kubenswrapper[4991]: I1006 08:43:11.260576 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3b515a-b48d-48f7-8775-a0299e07f231" path="/var/lib/kubelet/pods/7d3b515a-b48d-48f7-8775-a0299e07f231/volumes" Oct 06 08:43:14 crc kubenswrapper[4991]: I1006 08:43:14.547973 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:14 crc kubenswrapper[4991]: I1006 08:43:14.548448 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:14 crc kubenswrapper[4991]: I1006 08:43:14.631982 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:15 crc kubenswrapper[4991]: I1006 08:43:15.012116 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-592vz" Oct 06 08:43:15 crc kubenswrapper[4991]: I1006 08:43:15.054272 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-592vz"] Oct 06 08:43:16 crc kubenswrapper[4991]: I1006 08:43:16.940629 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-592vz" podUID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerName="registry-server" 
containerID="cri-o://35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8" gracePeriod=2
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.449238 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-592vz"
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.593190 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-utilities\") pod \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") "
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.593447 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn645\" (UniqueName: \"kubernetes.io/projected/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-kube-api-access-wn645\") pod \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") "
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.593514 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-catalog-content\") pod \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\" (UID: \"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c\") "
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.594595 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-utilities" (OuterVolumeSpecName: "utilities") pod "b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" (UID: "b5eb2f7c-6c04-4736-a677-4fd1a7571c9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.604208 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-kube-api-access-wn645" (OuterVolumeSpecName: "kube-api-access-wn645") pod "b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" (UID: "b5eb2f7c-6c04-4736-a677-4fd1a7571c9c"). InnerVolumeSpecName "kube-api-access-wn645". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.695310 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn645\" (UniqueName: \"kubernetes.io/projected/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-kube-api-access-wn645\") on node \"crc\" DevicePath \"\""
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.695339 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.958877 4991 generic.go:334] "Generic (PLEG): container finished" podID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerID="35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8" exitCode=0
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.958920 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-592vz" event={"ID":"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c","Type":"ContainerDied","Data":"35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8"}
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.959060 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-592vz" event={"ID":"b5eb2f7c-6c04-4736-a677-4fd1a7571c9c","Type":"ContainerDied","Data":"3af16025e9df4c3a3099430d9140237743b7e1a097353d8c3c878c11efdc1bf0"}
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.959077 4991 scope.go:117] "RemoveContainer" containerID="35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8"
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.958965 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-592vz"
Oct 06 08:43:17 crc kubenswrapper[4991]: I1006 08:43:17.983569 4991 scope.go:117] "RemoveContainer" containerID="e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0"
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.000451 4991 scope.go:117] "RemoveContainer" containerID="5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f"
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.052319 4991 scope.go:117] "RemoveContainer" containerID="35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8"
Oct 06 08:43:18 crc kubenswrapper[4991]: E1006 08:43:18.052722 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8\": container with ID starting with 35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8 not found: ID does not exist" containerID="35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8"
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.052752 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8"} err="failed to get container status \"35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8\": rpc error: code = NotFound desc = could not find container \"35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8\": container with ID starting with 35cba37db9dea4cb920052d7cc1301a56b915fe325ce95fc1909c3f32b3f61c8 not found: ID does not exist"
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.052780 4991 scope.go:117] "RemoveContainer" containerID="e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0"
Oct 06 08:43:18 crc kubenswrapper[4991]: E1006 08:43:18.053179 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0\": container with ID starting with e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0 not found: ID does not exist" containerID="e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0"
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.053205 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0"} err="failed to get container status \"e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0\": rpc error: code = NotFound desc = could not find container \"e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0\": container with ID starting with e4f6d0b19739abcb258214d109780b3ff64fc9ec2a7fcab21ee0b21a7146c8b0 not found: ID does not exist"
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.053217 4991 scope.go:117] "RemoveContainer" containerID="5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f"
Oct 06 08:43:18 crc kubenswrapper[4991]: E1006 08:43:18.053747 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f\": container with ID starting with 5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f not found: ID does not exist" containerID="5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f"
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.053792 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f"} err="failed to get container status \"5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f\": rpc error: code = NotFound desc = could not find container \"5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f\": container with ID starting with 5740ccdbe011f4b729456154fb3e69eb559c4d332b03e6945b189471bd34be4f not found: ID does not exist"
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.368581 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" (UID: "b5eb2f7c-6c04-4736-a677-4fd1a7571c9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.406638 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.608917 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-592vz"]
Oct 06 08:43:18 crc kubenswrapper[4991]: I1006 08:43:18.615338 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-592vz"]
Oct 06 08:43:19 crc kubenswrapper[4991]: I1006 08:43:19.277071 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" path="/var/lib/kubelet/pods/b5eb2f7c-6c04-4736-a677-4fd1a7571c9c/volumes"
Oct 06 08:43:27 crc kubenswrapper[4991]: I1006 08:43:27.530013 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:43:27 crc kubenswrapper[4991]: I1006 08:43:27.530824 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:43:57 crc kubenswrapper[4991]: I1006 08:43:57.529706 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:43:57 crc kubenswrapper[4991]: I1006 08:43:57.530383 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:43:57 crc kubenswrapper[4991]: I1006 08:43:57.530446 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m"
Oct 06 08:43:57 crc kubenswrapper[4991]: I1006 08:43:57.531402 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 08:43:57 crc kubenswrapper[4991]: I1006 08:43:57.531489 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" containerID="cri-o://e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" gracePeriod=600
Oct 06 08:43:57 crc kubenswrapper[4991]: E1006 08:43:57.671871 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:43:58 crc kubenswrapper[4991]: I1006 08:43:58.425545 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" exitCode=0
Oct 06 08:43:58 crc kubenswrapper[4991]: I1006 08:43:58.425620 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36"}
Oct 06 08:43:58 crc kubenswrapper[4991]: I1006 08:43:58.425681 4991 scope.go:117] "RemoveContainer" containerID="588bca8d19a8065db7c6c040db1c1694b8c7daffc697ab9a2f8788b4b3c06abd"
Oct 06 08:43:58 crc kubenswrapper[4991]: I1006 08:43:58.426475 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36"
Oct 06 08:43:58 crc kubenswrapper[4991]: E1006 08:43:58.427075 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:44:12 crc kubenswrapper[4991]: I1006 08:44:12.244019 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36"
Oct 06 08:44:12 crc kubenswrapper[4991]: E1006 08:44:12.245001 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:44:25 crc kubenswrapper[4991]: I1006 08:44:25.243888 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36"
Oct 06 08:44:25 crc kubenswrapper[4991]: E1006 08:44:25.244681 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:44:28 crc kubenswrapper[4991]: I1006 08:44:28.403282 4991 scope.go:117] "RemoveContainer" containerID="da83e83e66021684e9dadbeb740e9d7bfc886bf1d746ab2487aacf93a55a5577"
Oct 06 08:44:28 crc kubenswrapper[4991]: I1006 08:44:28.427028 4991 scope.go:117] "RemoveContainer" containerID="ca13f7ecc36df43a2b7566361dbd728e4e7c12b4be68e74f14a5ac1f6960d766"
Oct 06 08:44:28 crc kubenswrapper[4991]: I1006 08:44:28.466024 4991 scope.go:117] "RemoveContainer" containerID="5d0ad4dbaf7672bd759ce4b1d4274210a70349250211f5e7c2592266f9db5df1"
Oct 06 08:44:28 crc kubenswrapper[4991]: I1006 08:44:28.494799 4991 scope.go:117] "RemoveContainer" containerID="bfc485566a236f6ef73e7c095c103b023b4a78c4c9b57e8035394ff2a4ca0c8a"
Oct 06 08:44:28 crc kubenswrapper[4991]: I1006 08:44:28.527199 4991 scope.go:117] "RemoveContainer" containerID="8c2f621a06879a2c3c612a06ac045c0a16c5be885f62705d7fbfffbef118ca1e"
Oct 06 08:44:28 crc kubenswrapper[4991]: I1006 08:44:28.559100 4991 scope.go:117] "RemoveContainer" containerID="297d9cbbd1aa92c82c12d9afb0a240a23ded5b725c38bc8c336fbecf56a3f52f"
Oct 06 08:44:28 crc kubenswrapper[4991]: I1006 08:44:28.582576 4991 scope.go:117] "RemoveContainer" containerID="5b3441530c80e9311844d2033726d1a48cc481c7a9c1cf589a46dacfd173501f"
Oct 06 08:44:28 crc kubenswrapper[4991]: I1006 08:44:28.608546 4991 scope.go:117] "RemoveContainer" containerID="94c589983290634c76235daa1990cab452138af9c99951302ddc413d46fc20a4"
Oct 06 08:44:28 crc kubenswrapper[4991]: I1006 08:44:28.638067 4991 scope.go:117] "RemoveContainer" containerID="367eeb397b00a7696d851f72cefdac0146f8753511bf7f8e96400955a3dea1fd"
Oct 06 08:44:28 crc kubenswrapper[4991]: I1006 08:44:28.661792 4991 scope.go:117] "RemoveContainer" containerID="d2a68d324a56519f7999d2aa245e7844322a98233c289cf351e1037336ccc2f5"
Oct 06 08:44:36 crc kubenswrapper[4991]: I1006 08:44:36.244042 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36"
Oct 06 08:44:36 crc kubenswrapper[4991]: E1006 08:44:36.244847 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:44:50 crc kubenswrapper[4991]: I1006 08:44:50.244268 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36"
Oct 06 08:44:50 crc kubenswrapper[4991]: E1006 08:44:50.245155 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.151951 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"]
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152713 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-auditor"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152729 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-auditor"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152745 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-expirer"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152753 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-expirer"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152768 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" containerName="probe"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152777 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" containerName="probe"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152791 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152799 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152814 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-auditor"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152822 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-auditor"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152834 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-replicator"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152842 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-replicator"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152855 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerName="extract-utilities"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152864 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerName="extract-utilities"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152880 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-updater"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152888 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-updater"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152903 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="rsync"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152911 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="rsync"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152928 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152935 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152948 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152956 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152964 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-auditor"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152972 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-auditor"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.152986 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-updater"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.152994 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-updater"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153004 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-replicator"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153012 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-replicator"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153022 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server-init"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153030 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server-init"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153043 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovs-vswitchd"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153050 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovs-vswitchd"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153059 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="swift-recon-cron"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153068 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="swift-recon-cron"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153079 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-replicator"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153106 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-replicator"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153118 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" containerName="cinder-scheduler"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153126 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" containerName="cinder-scheduler"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153138 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3b515a-b48d-48f7-8775-a0299e07f231" containerName="mariadb-account-delete"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153147 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3b515a-b48d-48f7-8775-a0299e07f231" containerName="mariadb-account-delete"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153162 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerName="extract-content"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153172 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerName="extract-content"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153188 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153197 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153211 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerName="registry-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153219 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerName="registry-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153235 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b01de4c-42f4-4928-916a-6a9638340718" containerName="mariadb-account-delete"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153243 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b01de4c-42f4-4928-916a-6a9638340718" containerName="mariadb-account-delete"
Oct 06 08:45:00 crc kubenswrapper[4991]: E1006 08:45:00.153254 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-reaper"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153262 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-reaper"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153769 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-updater"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153789 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-auditor"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153802 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-auditor"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153815 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovsdb-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153829 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" containerName="probe"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153840 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-replicator"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153856 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-reaper"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153871 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-updater"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153884 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153896 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="account-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153907 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="rsync"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153923 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3b515a-b48d-48f7-8775-a0299e07f231" containerName="mariadb-account-delete"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153938 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-auditor"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153949 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd2d34c-a29e-47b8-98b4-f75fffb11673" containerName="cinder-scheduler"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153960 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5eb2f7c-6c04-4736-a677-4fd1a7571c9c" containerName="registry-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153970 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-replicator"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153983 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-replicator"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.153995 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="swift-recon-cron"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.154009 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="object-expirer"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.154019 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b01de4c-42f4-4928-916a-6a9638340718" containerName="mariadb-account-delete"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.154030 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb118a-286e-4ded-890d-fc788f9361f4" containerName="container-server"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.154044 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c7d8f9-5c85-4999-b60b-517b03ff5992" containerName="ovs-vswitchd"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.154602 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.157102 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.157699 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.187713 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"]
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.268953 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7pvl\" (UniqueName: \"kubernetes.io/projected/46b43eb1-531c-4afb-8c78-8463e34388cc-kube-api-access-m7pvl\") pod \"collect-profiles-29329005-zldnv\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.269026 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46b43eb1-531c-4afb-8c78-8463e34388cc-secret-volume\") pod \"collect-profiles-29329005-zldnv\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.269209 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46b43eb1-531c-4afb-8c78-8463e34388cc-config-volume\") pod \"collect-profiles-29329005-zldnv\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.370942 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7pvl\" (UniqueName: \"kubernetes.io/projected/46b43eb1-531c-4afb-8c78-8463e34388cc-kube-api-access-m7pvl\") pod \"collect-profiles-29329005-zldnv\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.371016 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46b43eb1-531c-4afb-8c78-8463e34388cc-secret-volume\") pod \"collect-profiles-29329005-zldnv\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.371086 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46b43eb1-531c-4afb-8c78-8463e34388cc-config-volume\") pod \"collect-profiles-29329005-zldnv\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.372928 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46b43eb1-531c-4afb-8c78-8463e34388cc-config-volume\") pod \"collect-profiles-29329005-zldnv\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.381880 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46b43eb1-531c-4afb-8c78-8463e34388cc-secret-volume\") pod \"collect-profiles-29329005-zldnv\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.398413 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7pvl\" (UniqueName: \"kubernetes.io/projected/46b43eb1-531c-4afb-8c78-8463e34388cc-kube-api-access-m7pvl\") pod \"collect-profiles-29329005-zldnv\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.487889 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"
Oct 06 08:45:00 crc kubenswrapper[4991]: I1006 08:45:00.934079 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv"]
Oct 06 08:45:00 crc kubenswrapper[4991]: W1006 08:45:00.936756 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b43eb1_531c_4afb_8c78_8463e34388cc.slice/crio-f81acfe601084745da435ea0bea2178bf3bbfc1da97a3a5b1d5b37c5c861b0c1 WatchSource:0}: Error finding container f81acfe601084745da435ea0bea2178bf3bbfc1da97a3a5b1d5b37c5c861b0c1: Status 404 returned error can't find the container with id f81acfe601084745da435ea0bea2178bf3bbfc1da97a3a5b1d5b37c5c861b0c1
Oct 06 08:45:01 crc kubenswrapper[4991]: I1006 08:45:01.094827 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv" event={"ID":"46b43eb1-531c-4afb-8c78-8463e34388cc","Type":"ContainerStarted","Data":"1da9174f3daeb4bd44832749dd216139f100ae018806ce57e39e0dc34205da54"}
Oct 06 08:45:01 crc kubenswrapper[4991]: I1006 08:45:01.094879 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv" event={"ID":"46b43eb1-531c-4afb-8c78-8463e34388cc","Type":"ContainerStarted","Data":"f81acfe601084745da435ea0bea2178bf3bbfc1da97a3a5b1d5b37c5c861b0c1"}
Oct 06 08:45:01 crc kubenswrapper[4991]: I1006 08:45:01.114486 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv" podStartSLOduration=1.114463174 podStartE2EDuration="1.114463174s" podCreationTimestamp="2025-10-06 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 08:45:01.108282132 +0000 UTC m=+1552.846032163" watchObservedRunningTime="2025-10-06 08:45:01.114463174 +0000 UTC m=+1552.852213205"
Oct 06 08:45:02 crc kubenswrapper[4991]: I1006 08:45:02.107657 4991 generic.go:334] "Generic (PLEG): container finished" podID="46b43eb1-531c-4afb-8c78-8463e34388cc" containerID="1da9174f3daeb4bd44832749dd216139f100ae018806ce57e39e0dc34205da54" exitCode=0
Oct 06 08:45:02 crc kubenswrapper[4991]: I1006 08:45:02.108035 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv" event={"ID":"46b43eb1-531c-4afb-8c78-8463e34388cc","Type":"ContainerDied","Data":"1da9174f3daeb4bd44832749dd216139f100ae018806ce57e39e0dc34205da54"}
Oct 06 08:45:03 crc kubenswrapper[4991]: I1006 08:45:03.470639 4991 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv" Oct 06 08:45:03 crc kubenswrapper[4991]: I1006 08:45:03.623763 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7pvl\" (UniqueName: \"kubernetes.io/projected/46b43eb1-531c-4afb-8c78-8463e34388cc-kube-api-access-m7pvl\") pod \"46b43eb1-531c-4afb-8c78-8463e34388cc\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " Oct 06 08:45:03 crc kubenswrapper[4991]: I1006 08:45:03.623901 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46b43eb1-531c-4afb-8c78-8463e34388cc-config-volume\") pod \"46b43eb1-531c-4afb-8c78-8463e34388cc\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " Oct 06 08:45:03 crc kubenswrapper[4991]: I1006 08:45:03.623974 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46b43eb1-531c-4afb-8c78-8463e34388cc-secret-volume\") pod \"46b43eb1-531c-4afb-8c78-8463e34388cc\" (UID: \"46b43eb1-531c-4afb-8c78-8463e34388cc\") " Oct 06 08:45:03 crc kubenswrapper[4991]: I1006 08:45:03.625666 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b43eb1-531c-4afb-8c78-8463e34388cc-config-volume" (OuterVolumeSpecName: "config-volume") pod "46b43eb1-531c-4afb-8c78-8463e34388cc" (UID: "46b43eb1-531c-4afb-8c78-8463e34388cc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 08:45:03 crc kubenswrapper[4991]: I1006 08:45:03.630260 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b43eb1-531c-4afb-8c78-8463e34388cc-kube-api-access-m7pvl" (OuterVolumeSpecName: "kube-api-access-m7pvl") pod "46b43eb1-531c-4afb-8c78-8463e34388cc" (UID: "46b43eb1-531c-4afb-8c78-8463e34388cc"). 
InnerVolumeSpecName "kube-api-access-m7pvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:45:03 crc kubenswrapper[4991]: I1006 08:45:03.630827 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46b43eb1-531c-4afb-8c78-8463e34388cc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "46b43eb1-531c-4afb-8c78-8463e34388cc" (UID: "46b43eb1-531c-4afb-8c78-8463e34388cc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 08:45:03 crc kubenswrapper[4991]: I1006 08:45:03.726839 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46b43eb1-531c-4afb-8c78-8463e34388cc-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:03 crc kubenswrapper[4991]: I1006 08:45:03.726906 4991 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46b43eb1-531c-4afb-8c78-8463e34388cc-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:03 crc kubenswrapper[4991]: I1006 08:45:03.726936 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7pvl\" (UniqueName: \"kubernetes.io/projected/46b43eb1-531c-4afb-8c78-8463e34388cc-kube-api-access-m7pvl\") on node \"crc\" DevicePath \"\"" Oct 06 08:45:04 crc kubenswrapper[4991]: I1006 08:45:04.131607 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv" event={"ID":"46b43eb1-531c-4afb-8c78-8463e34388cc","Type":"ContainerDied","Data":"f81acfe601084745da435ea0bea2178bf3bbfc1da97a3a5b1d5b37c5c861b0c1"} Oct 06 08:45:04 crc kubenswrapper[4991]: I1006 08:45:04.131666 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f81acfe601084745da435ea0bea2178bf3bbfc1da97a3a5b1d5b37c5c861b0c1" Oct 06 08:45:04 crc kubenswrapper[4991]: I1006 08:45:04.131736 4991 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329005-zldnv" Oct 06 08:45:05 crc kubenswrapper[4991]: I1006 08:45:05.243550 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:45:05 crc kubenswrapper[4991]: E1006 08:45:05.243781 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:45:17 crc kubenswrapper[4991]: I1006 08:45:17.244142 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:45:17 crc kubenswrapper[4991]: E1006 08:45:17.245641 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:45:28 crc kubenswrapper[4991]: I1006 08:45:28.243943 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:45:28 crc kubenswrapper[4991]: E1006 08:45:28.244669 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:45:28 crc kubenswrapper[4991]: I1006 08:45:28.848863 4991 scope.go:117] "RemoveContainer" containerID="fb10a7813dbd9816182db52dd9a3b501c4c766c110a2193adb6f6007214cdc4f" Oct 06 08:45:28 crc kubenswrapper[4991]: I1006 08:45:28.887801 4991 scope.go:117] "RemoveContainer" containerID="93bde15692d80421b438da8a25bddff9e4c4228214885a56f99ff4f63cea895f" Oct 06 08:45:28 crc kubenswrapper[4991]: I1006 08:45:28.937113 4991 scope.go:117] "RemoveContainer" containerID="80d2f6a1ef6afbd1ba9965b2005ac33f9bb76351dfedd91967c680c8672c4df2" Oct 06 08:45:28 crc kubenswrapper[4991]: I1006 08:45:28.996101 4991 scope.go:117] "RemoveContainer" containerID="4b198d47b63faa11ab3269678b4d8f8709c7776bf98c9f43b32df455e70fc098" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.027014 4991 scope.go:117] "RemoveContainer" containerID="3664b0b86ced009ed293faa237b64fa88b76a10da99303a55d6b375dde2bab1c" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.071116 4991 scope.go:117] "RemoveContainer" containerID="0d0a7b7be490409ea510848f2bbc97f380e6856575ff17ed7973b25771f88cfc" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.101682 4991 scope.go:117] "RemoveContainer" containerID="e7be98d10e1a8ee5623408c860107ad0dafba366dc2800ccdd3c919ef2c6078a" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.150604 4991 scope.go:117] "RemoveContainer" containerID="1cbd110e01dc7118014251c8877f2413d8ad43399e486f3327dd5f1ac11596d2" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.177463 4991 scope.go:117] "RemoveContainer" containerID="5ad03ef6a51021ee9836ed0f5b8afab91ba8065dfaba83a0ae1d7aef99eba78b" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.208926 4991 scope.go:117] "RemoveContainer" 
containerID="f39f809086eb2bd23e1a09e4b6df9642065b898b5270ec07f191210540900b73" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.233696 4991 scope.go:117] "RemoveContainer" containerID="c8554f9b2917b9400926d1608cbf5f4f2c2d666a21fe6417cd6a5eadb0c003c4" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.296088 4991 scope.go:117] "RemoveContainer" containerID="40cc2581ab3ca423c98e61d01fbf933e125eced21752dcff956d71eaf1890135" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.316533 4991 scope.go:117] "RemoveContainer" containerID="9d1e65fa883ba5cffd5e95aa22ba7e682849355dc55829a97f9c2ae259c034c9" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.332794 4991 scope.go:117] "RemoveContainer" containerID="855e698b4a89b7f90fca1d65066f42cf770d32b0ec2573bc09f0d5dcbde6d2e3" Oct 06 08:45:29 crc kubenswrapper[4991]: I1006 08:45:29.355656 4991 scope.go:117] "RemoveContainer" containerID="230ecef54a8d71e96735b7f37e4c28c5d6b96546bba37975613dfcde0832897f" Oct 06 08:45:41 crc kubenswrapper[4991]: I1006 08:45:41.243793 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:45:41 crc kubenswrapper[4991]: E1006 08:45:41.244975 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:45:55 crc kubenswrapper[4991]: I1006 08:45:55.244587 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:45:55 crc kubenswrapper[4991]: E1006 08:45:55.245800 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:46:10 crc kubenswrapper[4991]: I1006 08:46:10.243846 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:46:10 crc kubenswrapper[4991]: E1006 08:46:10.244714 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:46:24 crc kubenswrapper[4991]: I1006 08:46:24.243888 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:46:24 crc kubenswrapper[4991]: E1006 08:46:24.244448 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.646173 4991 scope.go:117] "RemoveContainer" containerID="b418bee21a0d9a97cc7e25f384e4781bb8a1da9561b86a4b569400fcf968b49c" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.679801 4991 scope.go:117] "RemoveContainer" 
containerID="f83ef24bc48b9f3df4545258f80864d16d673fc45ac88575e0e485addba7df62" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.723072 4991 scope.go:117] "RemoveContainer" containerID="2f29341e126502f19b2fe665eb6f63634e44f634ecd075749718883b8f004d5b" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.750001 4991 scope.go:117] "RemoveContainer" containerID="4e08aae5f1f3064fd06a75855d7641f5f9a9574da5cd200704d0371193acd2b3" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.778354 4991 scope.go:117] "RemoveContainer" containerID="448d92f50c90e335047076f51466baf5a63aec01c30cb258c80a14dc8b42453a" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.810196 4991 scope.go:117] "RemoveContainer" containerID="7b234e61f6c430aa76d75399799f1ef37b2b243dd9d9dd8ad0e6b5d63e0347e7" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.844376 4991 scope.go:117] "RemoveContainer" containerID="47090eb349924642f543c04a66d3390950a21742c182060091cfbb40d99efe76" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.886951 4991 scope.go:117] "RemoveContainer" containerID="9f7dc8083673fb521af061c8df5ca04354332444376a940728267b2a54832c2d" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.915549 4991 scope.go:117] "RemoveContainer" containerID="5495f389c5daa6ac8b781c1f5e42e30df351043da6778baf343002e37f3cee49" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.945524 4991 scope.go:117] "RemoveContainer" containerID="4fbfc2abb485c8ccd9560493a1360ee31985544c8877ac7b1baa4f76139308c7" Oct 06 08:46:29 crc kubenswrapper[4991]: I1006 08:46:29.987879 4991 scope.go:117] "RemoveContainer" containerID="bdf0cebdfc6bfe885875c71707250ed3c4a35ce750f74c8d41fb559482de14ee" Oct 06 08:46:30 crc kubenswrapper[4991]: I1006 08:46:30.038279 4991 scope.go:117] "RemoveContainer" containerID="455474fa51c249bc946c92e04261ca0b1c51eb0b6f215aa5fd14c0a8eb825d65" Oct 06 08:46:30 crc kubenswrapper[4991]: I1006 08:46:30.085093 4991 scope.go:117] "RemoveContainer" 
containerID="ed3d4866db94527f98aa6062572670cd20f71dc34b4e9fe3ca2ccfae1b03bda2" Oct 06 08:46:30 crc kubenswrapper[4991]: I1006 08:46:30.110377 4991 scope.go:117] "RemoveContainer" containerID="0c70b35b1a4450b4db02a166e4cb0db2437a7fcc554b453e7d86b3f8efc7685d" Oct 06 08:46:30 crc kubenswrapper[4991]: I1006 08:46:30.138932 4991 scope.go:117] "RemoveContainer" containerID="4adcd03b16369123fe98ce7c851a353189075e3043ebaad574d8063e0846582d" Oct 06 08:46:30 crc kubenswrapper[4991]: I1006 08:46:30.157583 4991 scope.go:117] "RemoveContainer" containerID="bb32cfe19b795b45e56a5fe6cf63e74c7d601ad82e46c5e7cffe99ccb8d6d994" Oct 06 08:46:35 crc kubenswrapper[4991]: I1006 08:46:35.243763 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:46:35 crc kubenswrapper[4991]: E1006 08:46:35.244572 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:46:48 crc kubenswrapper[4991]: I1006 08:46:48.243866 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:46:48 crc kubenswrapper[4991]: E1006 08:46:48.244604 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:47:01 crc 
kubenswrapper[4991]: I1006 08:47:01.243612 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:47:01 crc kubenswrapper[4991]: E1006 08:47:01.246433 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:47:16 crc kubenswrapper[4991]: I1006 08:47:16.244485 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:47:16 crc kubenswrapper[4991]: E1006 08:47:16.245736 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:47:28 crc kubenswrapper[4991]: I1006 08:47:28.243632 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:47:28 crc kubenswrapper[4991]: E1006 08:47:28.244389 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 
06 08:47:30 crc kubenswrapper[4991]: I1006 08:47:30.406054 4991 scope.go:117] "RemoveContainer" containerID="5b868b9187832c2005be815975b42773e519e81fb95f561cfb8a51e94477be13" Oct 06 08:47:30 crc kubenswrapper[4991]: I1006 08:47:30.439599 4991 scope.go:117] "RemoveContainer" containerID="c3b1614500005292c9e7b6920ac4a7cc87e019fd8e824585e552366b6101a5ab" Oct 06 08:47:30 crc kubenswrapper[4991]: I1006 08:47:30.466504 4991 scope.go:117] "RemoveContainer" containerID="afaad24da82e9977eb0954a81eb93a35ce855f528655c94b4ae6d47f4f212c3d" Oct 06 08:47:30 crc kubenswrapper[4991]: I1006 08:47:30.496210 4991 scope.go:117] "RemoveContainer" containerID="05fd087fc4e56815232a45eddb3364d72ba9e9e329ba6d624cee180ef68e0693" Oct 06 08:47:30 crc kubenswrapper[4991]: I1006 08:47:30.528206 4991 scope.go:117] "RemoveContainer" containerID="62e47d84171c7222528d366552dba6e76ba86c5bd424f4e5ce7c51dc4772d323" Oct 06 08:47:30 crc kubenswrapper[4991]: I1006 08:47:30.557276 4991 scope.go:117] "RemoveContainer" containerID="cbef79c571778565a610536c8feaba7e67321af73c156f93f621ec4d91f65fe9" Oct 06 08:47:30 crc kubenswrapper[4991]: I1006 08:47:30.613079 4991 scope.go:117] "RemoveContainer" containerID="8d08c1c46e9dbb33c0fcf110f50fb8a57ea2588d6e3f2a5e95349068fb7c093c" Oct 06 08:47:30 crc kubenswrapper[4991]: I1006 08:47:30.651700 4991 scope.go:117] "RemoveContainer" containerID="1428ac64bc5e21255061577f68b26d4466a1e854d5bb4746502e41b12d412d03" Oct 06 08:47:30 crc kubenswrapper[4991]: I1006 08:47:30.702208 4991 scope.go:117] "RemoveContainer" containerID="90cd40de1bdce0b9010647126aca623edc030c70b8cf3e25c080af7f4f0d06b5" Oct 06 08:47:30 crc kubenswrapper[4991]: I1006 08:47:30.789754 4991 scope.go:117] "RemoveContainer" containerID="2b0d0844a91b8badda9de344d7ec23d9d43da43f008821a4ea8b7ae982ffc991" Oct 06 08:47:43 crc kubenswrapper[4991]: I1006 08:47:43.244621 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:47:43 crc 
kubenswrapper[4991]: E1006 08:47:43.245407 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:47:57 crc kubenswrapper[4991]: I1006 08:47:57.244619 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:47:57 crc kubenswrapper[4991]: E1006 08:47:57.246969 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:48:08 crc kubenswrapper[4991]: I1006 08:48:08.244588 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:48:08 crc kubenswrapper[4991]: E1006 08:48:08.245574 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:48:21 crc kubenswrapper[4991]: I1006 08:48:21.245078 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 
06 08:48:21 crc kubenswrapper[4991]: E1006 08:48:21.246530 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:48:30 crc kubenswrapper[4991]: I1006 08:48:30.987999 4991 scope.go:117] "RemoveContainer" containerID="4c49ecde03a108088eaff49d978ace50e6654f0b6205db59fddf267b0df1faab" Oct 06 08:48:31 crc kubenswrapper[4991]: I1006 08:48:31.023002 4991 scope.go:117] "RemoveContainer" containerID="98d4fcdfdc9774dff4624bf92e206f1e36780461435c0e70b7a79655aa1bd813" Oct 06 08:48:31 crc kubenswrapper[4991]: I1006 08:48:31.059463 4991 scope.go:117] "RemoveContainer" containerID="6a0dd291d385b5c827db71bd9cfc93863a8619475a3e50f8d7d1d405394842f9" Oct 06 08:48:31 crc kubenswrapper[4991]: I1006 08:48:31.089150 4991 scope.go:117] "RemoveContainer" containerID="832edd5d33c524ced05fce73559b98b910c69bcaa4f037231d4db46add5712d9" Oct 06 08:48:31 crc kubenswrapper[4991]: I1006 08:48:31.109556 4991 scope.go:117] "RemoveContainer" containerID="d7106d512c69044297389f1917afcf12abd789ac37bc7313f94e87abdc2dd932" Oct 06 08:48:32 crc kubenswrapper[4991]: I1006 08:48:32.243907 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:48:32 crc kubenswrapper[4991]: E1006 08:48:32.244528 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:48:47 crc kubenswrapper[4991]: I1006 08:48:47.244038 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:48:47 crc kubenswrapper[4991]: E1006 08:48:47.246097 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:49:01 crc kubenswrapper[4991]: I1006 08:49:01.244056 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36" Oct 06 08:49:01 crc kubenswrapper[4991]: I1006 08:49:01.693419 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"90b98db4325250f49a805f3086b07d67be3d9b4c9b074a9983cddc7c950dae26"} Oct 06 08:49:31 crc kubenswrapper[4991]: I1006 08:49:31.225403 4991 scope.go:117] "RemoveContainer" containerID="87310a359b34691e77d2310cf9bb176e0eebac4a7cd386913eb46acf7f2817cf" Oct 06 08:49:31 crc kubenswrapper[4991]: I1006 08:49:31.254517 4991 scope.go:117] "RemoveContainer" containerID="cae646f42382972d5beded15a444697216db639c9bec708007612049ee7f8e6f" Oct 06 08:49:31 crc kubenswrapper[4991]: I1006 08:49:31.284070 4991 scope.go:117] "RemoveContainer" containerID="8e9dca7e656636d8cc5ec73d30b74365cb11a8a92789fecf7545e4cc91f5646c" Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.795034 4991 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-4f8w6"] Oct 06 08:51:22 crc kubenswrapper[4991]: E1006 08:51:22.797332 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b43eb1-531c-4afb-8c78-8463e34388cc" containerName="collect-profiles" Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.797452 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b43eb1-531c-4afb-8c78-8463e34388cc" containerName="collect-profiles" Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.797737 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b43eb1-531c-4afb-8c78-8463e34388cc" containerName="collect-profiles" Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.799106 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4f8w6" Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.807129 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4f8w6"] Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.826049 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-utilities\") pod \"community-operators-4f8w6\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") " pod="openshift-marketplace/community-operators-4f8w6" Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.826117 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-catalog-content\") pod \"community-operators-4f8w6\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") " pod="openshift-marketplace/community-operators-4f8w6" Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.826154 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-lh9sv\" (UniqueName: \"kubernetes.io/projected/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-kube-api-access-lh9sv\") pod \"community-operators-4f8w6\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") " pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.927660 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-catalog-content\") pod \"community-operators-4f8w6\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") " pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.927757 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh9sv\" (UniqueName: \"kubernetes.io/projected/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-kube-api-access-lh9sv\") pod \"community-operators-4f8w6\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") " pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.927892 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-utilities\") pod \"community-operators-4f8w6\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") " pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.928574 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-utilities\") pod \"community-operators-4f8w6\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") " pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.928977 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-catalog-content\") pod \"community-operators-4f8w6\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") " pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:22 crc kubenswrapper[4991]: I1006 08:51:22.956493 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh9sv\" (UniqueName: \"kubernetes.io/projected/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-kube-api-access-lh9sv\") pod \"community-operators-4f8w6\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") " pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:23 crc kubenswrapper[4991]: I1006 08:51:23.123089 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:23 crc kubenswrapper[4991]: I1006 08:51:23.613258 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4f8w6"]
Oct 06 08:51:23 crc kubenswrapper[4991]: W1006 08:51:23.625107 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e25c1a3_bc4d_4eed_8e4a_54f7636d7784.slice/crio-725a1359c3947a590ae3eaebdd6acc9237fb31c79320291474df97eebcd2dc8d WatchSource:0}: Error finding container 725a1359c3947a590ae3eaebdd6acc9237fb31c79320291474df97eebcd2dc8d: Status 404 returned error can't find the container with id 725a1359c3947a590ae3eaebdd6acc9237fb31c79320291474df97eebcd2dc8d
Oct 06 08:51:23 crc kubenswrapper[4991]: I1006 08:51:23.927759 4991 generic.go:334] "Generic (PLEG): container finished" podID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerID="669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa" exitCode=0
Oct 06 08:51:23 crc kubenswrapper[4991]: I1006 08:51:23.927851 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f8w6" event={"ID":"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784","Type":"ContainerDied","Data":"669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa"}
Oct 06 08:51:23 crc kubenswrapper[4991]: I1006 08:51:23.928110 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f8w6" event={"ID":"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784","Type":"ContainerStarted","Data":"725a1359c3947a590ae3eaebdd6acc9237fb31c79320291474df97eebcd2dc8d"}
Oct 06 08:51:23 crc kubenswrapper[4991]: I1006 08:51:23.930365 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 08:51:24 crc kubenswrapper[4991]: I1006 08:51:24.941511 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f8w6" event={"ID":"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784","Type":"ContainerStarted","Data":"4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a"}
Oct 06 08:51:25 crc kubenswrapper[4991]: I1006 08:51:25.954657 4991 generic.go:334] "Generic (PLEG): container finished" podID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerID="4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a" exitCode=0
Oct 06 08:51:25 crc kubenswrapper[4991]: I1006 08:51:25.954742 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f8w6" event={"ID":"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784","Type":"ContainerDied","Data":"4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a"}
Oct 06 08:51:26 crc kubenswrapper[4991]: I1006 08:51:26.975363 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f8w6" event={"ID":"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784","Type":"ContainerStarted","Data":"469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76"}
Oct 06 08:51:27 crc kubenswrapper[4991]: I1006 08:51:27.001109 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4f8w6" podStartSLOduration=2.572468115 podStartE2EDuration="5.001089622s" podCreationTimestamp="2025-10-06 08:51:22 +0000 UTC" firstStartedPulling="2025-10-06 08:51:23.93001649 +0000 UTC m=+1935.667766521" lastFinishedPulling="2025-10-06 08:51:26.358637987 +0000 UTC m=+1938.096388028" observedRunningTime="2025-10-06 08:51:27.000211578 +0000 UTC m=+1938.737961629" watchObservedRunningTime="2025-10-06 08:51:27.001089622 +0000 UTC m=+1938.738839673"
Oct 06 08:51:27 crc kubenswrapper[4991]: I1006 08:51:27.529501 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:51:27 crc kubenswrapper[4991]: I1006 08:51:27.529585 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:51:33 crc kubenswrapper[4991]: I1006 08:51:33.124483 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:33 crc kubenswrapper[4991]: I1006 08:51:33.125607 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:33 crc kubenswrapper[4991]: I1006 08:51:33.198421 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:34 crc kubenswrapper[4991]: I1006 08:51:34.092398 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:34 crc kubenswrapper[4991]: I1006 08:51:34.146762 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4f8w6"]
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.046611 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4f8w6" podUID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerName="registry-server" containerID="cri-o://469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76" gracePeriod=2
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.475704 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.609151 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-catalog-content\") pod \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") "
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.609381 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-utilities\") pod \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") "
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.609442 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh9sv\" (UniqueName: \"kubernetes.io/projected/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-kube-api-access-lh9sv\") pod \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\" (UID: \"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784\") "
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.610908 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-utilities" (OuterVolumeSpecName: "utilities") pod "1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" (UID: "1e25c1a3-bc4d-4eed-8e4a-54f7636d7784"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.622418 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-kube-api-access-lh9sv" (OuterVolumeSpecName: "kube-api-access-lh9sv") pod "1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" (UID: "1e25c1a3-bc4d-4eed-8e4a-54f7636d7784"). InnerVolumeSpecName "kube-api-access-lh9sv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.671879 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" (UID: "1e25c1a3-bc4d-4eed-8e4a-54f7636d7784"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.711715 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.711932 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh9sv\" (UniqueName: \"kubernetes.io/projected/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-kube-api-access-lh9sv\") on node \"crc\" DevicePath \"\""
Oct 06 08:51:36 crc kubenswrapper[4991]: I1006 08:51:36.712031 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.058906 4991 generic.go:334] "Generic (PLEG): container finished" podID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerID="469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76" exitCode=0
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.058986 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4f8w6"
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.058963 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f8w6" event={"ID":"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784","Type":"ContainerDied","Data":"469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76"}
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.059173 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f8w6" event={"ID":"1e25c1a3-bc4d-4eed-8e4a-54f7636d7784","Type":"ContainerDied","Data":"725a1359c3947a590ae3eaebdd6acc9237fb31c79320291474df97eebcd2dc8d"}
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.059226 4991 scope.go:117] "RemoveContainer" containerID="469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76"
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.091150 4991 scope.go:117] "RemoveContainer" containerID="4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a"
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.122371 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4f8w6"]
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.123697 4991 scope.go:117] "RemoveContainer" containerID="669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa"
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.133621 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4f8w6"]
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.169436 4991 scope.go:117] "RemoveContainer" containerID="469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76"
Oct 06 08:51:37 crc kubenswrapper[4991]: E1006 08:51:37.170110 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76\": container with ID starting with 469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76 not found: ID does not exist" containerID="469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76"
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.170182 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76"} err="failed to get container status \"469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76\": rpc error: code = NotFound desc = could not find container \"469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76\": container with ID starting with 469d9e6781b12ef40fa93ca10a9871d2d7c257e5b1139e0b70927a0f54653e76 not found: ID does not exist"
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.170232 4991 scope.go:117] "RemoveContainer" containerID="4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a"
Oct 06 08:51:37 crc kubenswrapper[4991]: E1006 08:51:37.170733 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a\": container with ID starting with 4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a not found: ID does not exist" containerID="4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a"
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.170832 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a"} err="failed to get container status \"4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a\": rpc error: code = NotFound desc = could not find container \"4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a\": container with ID starting with 4055ca6a59e0a7097501aff58c89a4ff7bb1b66d1a99e4ae983ec08f4b69c40a not found: ID does not exist"
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.170873 4991 scope.go:117] "RemoveContainer" containerID="669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa"
Oct 06 08:51:37 crc kubenswrapper[4991]: E1006 08:51:37.171226 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa\": container with ID starting with 669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa not found: ID does not exist" containerID="669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa"
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.171267 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa"} err="failed to get container status \"669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa\": rpc error: code = NotFound desc = could not find container \"669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa\": container with ID starting with 669088e4850c02cc8b8ef9005e2baca524a6ca19686f9a7e8aa7d6be531062fa not found: ID does not exist"
Oct 06 08:51:37 crc kubenswrapper[4991]: I1006 08:51:37.258882 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" path="/var/lib/kubelet/pods/1e25c1a3-bc4d-4eed-8e4a-54f7636d7784/volumes"
Oct 06 08:51:57 crc kubenswrapper[4991]: I1006 08:51:57.529659 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:51:57 crc kubenswrapper[4991]: I1006 08:51:57.530992 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:52:27 crc kubenswrapper[4991]: I1006 08:52:27.529927 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 08:52:27 crc kubenswrapper[4991]: I1006 08:52:27.530768 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 08:52:27 crc kubenswrapper[4991]: I1006 08:52:27.530851 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m"
Oct 06 08:52:27 crc kubenswrapper[4991]: I1006 08:52:27.531751 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90b98db4325250f49a805f3086b07d67be3d9b4c9b074a9983cddc7c950dae26"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 08:52:27 crc kubenswrapper[4991]: I1006 08:52:27.531937 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" containerID="cri-o://90b98db4325250f49a805f3086b07d67be3d9b4c9b074a9983cddc7c950dae26" gracePeriod=600
Oct 06 08:52:28 crc kubenswrapper[4991]: I1006 08:52:28.561146 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="90b98db4325250f49a805f3086b07d67be3d9b4c9b074a9983cddc7c950dae26" exitCode=0
Oct 06 08:52:28 crc kubenswrapper[4991]: I1006 08:52:28.561196 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"90b98db4325250f49a805f3086b07d67be3d9b4c9b074a9983cddc7c950dae26"}
Oct 06 08:52:28 crc kubenswrapper[4991]: I1006 08:52:28.561807 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"}
Oct 06 08:52:28 crc kubenswrapper[4991]: I1006 08:52:28.561838 4991 scope.go:117] "RemoveContainer" containerID="e4bf11ecc45d74a7202d09762de1801cfc1ed513e1eda694ad55f6df52762e36"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.237997 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-67fcs"]
Oct 06 08:52:30 crc kubenswrapper[4991]: E1006 08:52:30.238881 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerName="extract-content"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.238900 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerName="extract-content"
Oct 06 08:52:30 crc kubenswrapper[4991]: E1006 08:52:30.238942 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerName="extract-utilities"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.238951 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerName="extract-utilities"
Oct 06 08:52:30 crc kubenswrapper[4991]: E1006 08:52:30.238967 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerName="registry-server"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.238975 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerName="registry-server"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.239158 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e25c1a3-bc4d-4eed-8e4a-54f7636d7784" containerName="registry-server"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.240787 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.256227 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-67fcs"]
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.360117 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-catalog-content\") pod \"certified-operators-67fcs\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") " pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.360185 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt7sg\" (UniqueName: \"kubernetes.io/projected/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-kube-api-access-wt7sg\") pod \"certified-operators-67fcs\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") " pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.360283 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-utilities\") pod \"certified-operators-67fcs\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") " pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.462173 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-catalog-content\") pod \"certified-operators-67fcs\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") " pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.462278 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt7sg\" (UniqueName: \"kubernetes.io/projected/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-kube-api-access-wt7sg\") pod \"certified-operators-67fcs\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") " pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.462386 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-utilities\") pod \"certified-operators-67fcs\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") " pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.462989 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-utilities\") pod \"certified-operators-67fcs\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") " pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.463041 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-catalog-content\") pod \"certified-operators-67fcs\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") " pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.485801 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt7sg\" (UniqueName: \"kubernetes.io/projected/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-kube-api-access-wt7sg\") pod \"certified-operators-67fcs\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") " pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.594071 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:30 crc kubenswrapper[4991]: I1006 08:52:30.843259 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-67fcs"]
Oct 06 08:52:31 crc kubenswrapper[4991]: I1006 08:52:31.589879 4991 generic.go:334] "Generic (PLEG): container finished" podID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerID="bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761" exitCode=0
Oct 06 08:52:31 crc kubenswrapper[4991]: I1006 08:52:31.589983 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67fcs" event={"ID":"759f6d2e-1946-4af0-bbca-5ff91a6fc49e","Type":"ContainerDied","Data":"bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761"}
Oct 06 08:52:31 crc kubenswrapper[4991]: I1006 08:52:31.591235 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67fcs" event={"ID":"759f6d2e-1946-4af0-bbca-5ff91a6fc49e","Type":"ContainerStarted","Data":"7bb7e30525a32d2083a0ed788c7243adbcc96829c6d5825bda3d1195c7ff66a5"}
Oct 06 08:52:32 crc kubenswrapper[4991]: I1006 08:52:32.600628 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67fcs" event={"ID":"759f6d2e-1946-4af0-bbca-5ff91a6fc49e","Type":"ContainerStarted","Data":"e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2"}
Oct 06 08:52:33 crc kubenswrapper[4991]: I1006 08:52:33.613285 4991 generic.go:334] "Generic (PLEG): container finished" podID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerID="e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2" exitCode=0
Oct 06 08:52:33 crc kubenswrapper[4991]: I1006 08:52:33.613488 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67fcs" event={"ID":"759f6d2e-1946-4af0-bbca-5ff91a6fc49e","Type":"ContainerDied","Data":"e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2"}
Oct 06 08:52:34 crc kubenswrapper[4991]: I1006 08:52:34.627716 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67fcs" event={"ID":"759f6d2e-1946-4af0-bbca-5ff91a6fc49e","Type":"ContainerStarted","Data":"b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b"}
Oct 06 08:52:35 crc kubenswrapper[4991]: I1006 08:52:35.666283 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-67fcs" podStartSLOduration=2.975117704 podStartE2EDuration="5.666262131s" podCreationTimestamp="2025-10-06 08:52:30 +0000 UTC" firstStartedPulling="2025-10-06 08:52:31.591904973 +0000 UTC m=+2003.329655004" lastFinishedPulling="2025-10-06 08:52:34.28304937 +0000 UTC m=+2006.020799431" observedRunningTime="2025-10-06 08:52:35.659521144 +0000 UTC m=+2007.397271175" watchObservedRunningTime="2025-10-06 08:52:35.666262131 +0000 UTC m=+2007.404012152"
Oct 06 08:52:40 crc kubenswrapper[4991]: I1006 08:52:40.594793 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:40 crc kubenswrapper[4991]: I1006 08:52:40.595707 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:40 crc kubenswrapper[4991]: I1006 08:52:40.658078 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:40 crc kubenswrapper[4991]: I1006 08:52:40.730809 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:40 crc kubenswrapper[4991]: I1006 08:52:40.903531 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-67fcs"]
Oct 06 08:52:42 crc kubenswrapper[4991]: I1006 08:52:42.699545 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-67fcs" podUID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerName="registry-server" containerID="cri-o://b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b" gracePeriod=2
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.160620 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-67fcs"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.260135 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt7sg\" (UniqueName: \"kubernetes.io/projected/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-kube-api-access-wt7sg\") pod \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") "
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.260213 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-utilities\") pod \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") "
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.260275 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-catalog-content\") pod \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\" (UID: \"759f6d2e-1946-4af0-bbca-5ff91a6fc49e\") "
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.261129 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-utilities" (OuterVolumeSpecName: "utilities") pod "759f6d2e-1946-4af0-bbca-5ff91a6fc49e" (UID: "759f6d2e-1946-4af0-bbca-5ff91a6fc49e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.266897 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-kube-api-access-wt7sg" (OuterVolumeSpecName: "kube-api-access-wt7sg") pod "759f6d2e-1946-4af0-bbca-5ff91a6fc49e" (UID: "759f6d2e-1946-4af0-bbca-5ff91a6fc49e"). InnerVolumeSpecName "kube-api-access-wt7sg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.323102 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "759f6d2e-1946-4af0-bbca-5ff91a6fc49e" (UID: "759f6d2e-1946-4af0-bbca-5ff91a6fc49e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.338165 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-thv67"]
Oct 06 08:52:43 crc kubenswrapper[4991]: E1006 08:52:43.338926 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerName="extract-utilities"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.339020 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerName="extract-utilities"
Oct 06 08:52:43 crc kubenswrapper[4991]: E1006 08:52:43.339048 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerName="registry-server"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.339054 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerName="registry-server"
Oct 06 08:52:43 crc kubenswrapper[4991]: E1006 08:52:43.339078 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerName="extract-content"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.339090 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerName="extract-content"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.339754 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerName="registry-server"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.341633 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thv67"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.356392 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-thv67"]
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.362746 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt7sg\" (UniqueName: \"kubernetes.io/projected/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-kube-api-access-wt7sg\") on node \"crc\" DevicePath \"\""
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.362789 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.362804 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f6d2e-1946-4af0-bbca-5ff91a6fc49e-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.464191 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2vqt\" (UniqueName: \"kubernetes.io/projected/a0e5f44c-d036-4810-afe5-9557e209c75a-kube-api-access-j2vqt\") pod \"redhat-marketplace-thv67\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " pod="openshift-marketplace/redhat-marketplace-thv67"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.464321 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-catalog-content\") pod \"redhat-marketplace-thv67\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " pod="openshift-marketplace/redhat-marketplace-thv67"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.464420 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-utilities\") pod \"redhat-marketplace-thv67\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " pod="openshift-marketplace/redhat-marketplace-thv67"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.565979 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-catalog-content\") pod \"redhat-marketplace-thv67\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " pod="openshift-marketplace/redhat-marketplace-thv67"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.566077 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-utilities\") pod \"redhat-marketplace-thv67\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " pod="openshift-marketplace/redhat-marketplace-thv67"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.566116 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2vqt\" (UniqueName: \"kubernetes.io/projected/a0e5f44c-d036-4810-afe5-9557e209c75a-kube-api-access-j2vqt\") pod \"redhat-marketplace-thv67\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " pod="openshift-marketplace/redhat-marketplace-thv67"
Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.566538 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-catalog-content\") pod \"redhat-marketplace-thv67\" (UID:
\"a0e5f44c-d036-4810-afe5-9557e209c75a\") " pod="openshift-marketplace/redhat-marketplace-thv67" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.566602 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-utilities\") pod \"redhat-marketplace-thv67\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " pod="openshift-marketplace/redhat-marketplace-thv67" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.582666 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2vqt\" (UniqueName: \"kubernetes.io/projected/a0e5f44c-d036-4810-afe5-9557e209c75a-kube-api-access-j2vqt\") pod \"redhat-marketplace-thv67\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " pod="openshift-marketplace/redhat-marketplace-thv67" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.665048 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thv67" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.713591 4991 generic.go:334] "Generic (PLEG): container finished" podID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" containerID="b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b" exitCode=0 Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.713635 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67fcs" event={"ID":"759f6d2e-1946-4af0-bbca-5ff91a6fc49e","Type":"ContainerDied","Data":"b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b"} Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.713666 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-67fcs" event={"ID":"759f6d2e-1946-4af0-bbca-5ff91a6fc49e","Type":"ContainerDied","Data":"7bb7e30525a32d2083a0ed788c7243adbcc96829c6d5825bda3d1195c7ff66a5"} Oct 06 08:52:43 crc 
kubenswrapper[4991]: I1006 08:52:43.713685 4991 scope.go:117] "RemoveContainer" containerID="b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.713892 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-67fcs" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.743336 4991 scope.go:117] "RemoveContainer" containerID="e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.759650 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-67fcs"] Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.766711 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-67fcs"] Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.778185 4991 scope.go:117] "RemoveContainer" containerID="bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.797342 4991 scope.go:117] "RemoveContainer" containerID="b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b" Oct 06 08:52:43 crc kubenswrapper[4991]: E1006 08:52:43.797894 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b\": container with ID starting with b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b not found: ID does not exist" containerID="b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.797938 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b"} err="failed to get container status 
\"b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b\": rpc error: code = NotFound desc = could not find container \"b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b\": container with ID starting with b65cdc88be9951d3032827d9034e38d1734fcf2b75839e9d2b1968fbdc5dbe8b not found: ID does not exist" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.797967 4991 scope.go:117] "RemoveContainer" containerID="e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2" Oct 06 08:52:43 crc kubenswrapper[4991]: E1006 08:52:43.798442 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2\": container with ID starting with e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2 not found: ID does not exist" containerID="e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.798541 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2"} err="failed to get container status \"e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2\": rpc error: code = NotFound desc = could not find container \"e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2\": container with ID starting with e596c76c150600bfe9a10d64de065d7980bcc88f35bad91ac47b88ab85d093e2 not found: ID does not exist" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.798561 4991 scope.go:117] "RemoveContainer" containerID="bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761" Oct 06 08:52:43 crc kubenswrapper[4991]: E1006 08:52:43.799046 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761\": container with ID starting with bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761 not found: ID does not exist" containerID="bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761" Oct 06 08:52:43 crc kubenswrapper[4991]: I1006 08:52:43.799072 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761"} err="failed to get container status \"bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761\": rpc error: code = NotFound desc = could not find container \"bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761\": container with ID starting with bf88423c99f3cd3acd5e1f1f56053e4d85d12f83c3e2a564249340ae9398f761 not found: ID does not exist" Oct 06 08:52:44 crc kubenswrapper[4991]: I1006 08:52:44.103817 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-thv67"] Oct 06 08:52:44 crc kubenswrapper[4991]: W1006 08:52:44.105392 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0e5f44c_d036_4810_afe5_9557e209c75a.slice/crio-98e5101a00053abdadee3dd1401193e8ed4eebf11bd3617176b190a8e719b36e WatchSource:0}: Error finding container 98e5101a00053abdadee3dd1401193e8ed4eebf11bd3617176b190a8e719b36e: Status 404 returned error can't find the container with id 98e5101a00053abdadee3dd1401193e8ed4eebf11bd3617176b190a8e719b36e Oct 06 08:52:44 crc kubenswrapper[4991]: I1006 08:52:44.721582 4991 generic.go:334] "Generic (PLEG): container finished" podID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerID="47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05" exitCode=0 Oct 06 08:52:44 crc kubenswrapper[4991]: I1006 08:52:44.721622 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-thv67" event={"ID":"a0e5f44c-d036-4810-afe5-9557e209c75a","Type":"ContainerDied","Data":"47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05"} Oct 06 08:52:44 crc kubenswrapper[4991]: I1006 08:52:44.721648 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thv67" event={"ID":"a0e5f44c-d036-4810-afe5-9557e209c75a","Type":"ContainerStarted","Data":"98e5101a00053abdadee3dd1401193e8ed4eebf11bd3617176b190a8e719b36e"} Oct 06 08:52:45 crc kubenswrapper[4991]: I1006 08:52:45.262041 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="759f6d2e-1946-4af0-bbca-5ff91a6fc49e" path="/var/lib/kubelet/pods/759f6d2e-1946-4af0-bbca-5ff91a6fc49e/volumes" Oct 06 08:52:45 crc kubenswrapper[4991]: I1006 08:52:45.731126 4991 generic.go:334] "Generic (PLEG): container finished" podID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerID="b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a" exitCode=0 Oct 06 08:52:45 crc kubenswrapper[4991]: I1006 08:52:45.731197 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thv67" event={"ID":"a0e5f44c-d036-4810-afe5-9557e209c75a","Type":"ContainerDied","Data":"b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a"} Oct 06 08:52:46 crc kubenswrapper[4991]: I1006 08:52:46.741102 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thv67" event={"ID":"a0e5f44c-d036-4810-afe5-9557e209c75a","Type":"ContainerStarted","Data":"4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b"} Oct 06 08:52:46 crc kubenswrapper[4991]: I1006 08:52:46.759665 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-thv67" podStartSLOduration=2.329856034 podStartE2EDuration="3.75964106s" podCreationTimestamp="2025-10-06 08:52:43 +0000 UTC" 
firstStartedPulling="2025-10-06 08:52:44.723712824 +0000 UTC m=+2016.461462845" lastFinishedPulling="2025-10-06 08:52:46.15349785 +0000 UTC m=+2017.891247871" observedRunningTime="2025-10-06 08:52:46.757089709 +0000 UTC m=+2018.494839740" watchObservedRunningTime="2025-10-06 08:52:46.75964106 +0000 UTC m=+2018.497391121" Oct 06 08:52:53 crc kubenswrapper[4991]: I1006 08:52:53.665614 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-thv67" Oct 06 08:52:53 crc kubenswrapper[4991]: I1006 08:52:53.667729 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-thv67" Oct 06 08:52:53 crc kubenswrapper[4991]: I1006 08:52:53.737989 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-thv67" Oct 06 08:52:53 crc kubenswrapper[4991]: I1006 08:52:53.861867 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-thv67" Oct 06 08:52:53 crc kubenswrapper[4991]: I1006 08:52:53.982944 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-thv67"] Oct 06 08:52:55 crc kubenswrapper[4991]: I1006 08:52:55.826523 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-thv67" podUID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerName="registry-server" containerID="cri-o://4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b" gracePeriod=2 Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.260392 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thv67" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.362567 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-utilities\") pod \"a0e5f44c-d036-4810-afe5-9557e209c75a\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.362681 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-catalog-content\") pod \"a0e5f44c-d036-4810-afe5-9557e209c75a\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.362840 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2vqt\" (UniqueName: \"kubernetes.io/projected/a0e5f44c-d036-4810-afe5-9557e209c75a-kube-api-access-j2vqt\") pod \"a0e5f44c-d036-4810-afe5-9557e209c75a\" (UID: \"a0e5f44c-d036-4810-afe5-9557e209c75a\") " Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.364897 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-utilities" (OuterVolumeSpecName: "utilities") pod "a0e5f44c-d036-4810-afe5-9557e209c75a" (UID: "a0e5f44c-d036-4810-afe5-9557e209c75a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.371695 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e5f44c-d036-4810-afe5-9557e209c75a-kube-api-access-j2vqt" (OuterVolumeSpecName: "kube-api-access-j2vqt") pod "a0e5f44c-d036-4810-afe5-9557e209c75a" (UID: "a0e5f44c-d036-4810-afe5-9557e209c75a"). InnerVolumeSpecName "kube-api-access-j2vqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.386894 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0e5f44c-d036-4810-afe5-9557e209c75a" (UID: "a0e5f44c-d036-4810-afe5-9557e209c75a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.464797 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.464847 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0e5f44c-d036-4810-afe5-9557e209c75a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.464863 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2vqt\" (UniqueName: \"kubernetes.io/projected/a0e5f44c-d036-4810-afe5-9557e209c75a-kube-api-access-j2vqt\") on node \"crc\" DevicePath \"\"" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.850708 4991 generic.go:334] "Generic (PLEG): container finished" podID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerID="4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b" exitCode=0 Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.850751 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-thv67" event={"ID":"a0e5f44c-d036-4810-afe5-9557e209c75a","Type":"ContainerDied","Data":"4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b"} Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.850814 4991 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-thv67" event={"ID":"a0e5f44c-d036-4810-afe5-9557e209c75a","Type":"ContainerDied","Data":"98e5101a00053abdadee3dd1401193e8ed4eebf11bd3617176b190a8e719b36e"} Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.850843 4991 scope.go:117] "RemoveContainer" containerID="4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.850840 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-thv67" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.884968 4991 scope.go:117] "RemoveContainer" containerID="b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.905958 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-thv67"] Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.915079 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-thv67"] Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.923758 4991 scope.go:117] "RemoveContainer" containerID="47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.959168 4991 scope.go:117] "RemoveContainer" containerID="4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b" Oct 06 08:52:56 crc kubenswrapper[4991]: E1006 08:52:56.959615 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b\": container with ID starting with 4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b not found: ID does not exist" containerID="4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.959659 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b"} err="failed to get container status \"4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b\": rpc error: code = NotFound desc = could not find container \"4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b\": container with ID starting with 4c02531a4d95fd98efae3e9f9c8e60de010d83698fce26ca9a462cb76ff3247b not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.959689 4991 scope.go:117] "RemoveContainer" containerID="b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a" Oct 06 08:52:56 crc kubenswrapper[4991]: E1006 08:52:56.960068 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a\": container with ID starting with b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a not found: ID does not exist" containerID="b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.960143 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a"} err="failed to get container status \"b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a\": rpc error: code = NotFound desc = could not find container \"b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a\": container with ID starting with b9eb66302524d6240418b657cb03990a705dcd6a0ab231b5d54a7b1d6a255a9a not found: ID does not exist" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.960186 4991 scope.go:117] "RemoveContainer" containerID="47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05" Oct 06 08:52:56 crc kubenswrapper[4991]: E1006 
08:52:56.960863 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05\": container with ID starting with 47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05 not found: ID does not exist" containerID="47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05" Oct 06 08:52:56 crc kubenswrapper[4991]: I1006 08:52:56.960897 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05"} err="failed to get container status \"47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05\": rpc error: code = NotFound desc = could not find container \"47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05\": container with ID starting with 47d7f62756da3785a497e358aff76f23a44efe3a5b39ecb3f36964720decac05 not found: ID does not exist" Oct 06 08:52:57 crc kubenswrapper[4991]: I1006 08:52:57.261821 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e5f44c-d036-4810-afe5-9557e209c75a" path="/var/lib/kubelet/pods/a0e5f44c-d036-4810-afe5-9557e209c75a/volumes" Oct 06 08:54:27 crc kubenswrapper[4991]: I1006 08:54:27.529559 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:54:27 crc kubenswrapper[4991]: I1006 08:54:27.530329 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 06 08:54:57 crc kubenswrapper[4991]: I1006 08:54:57.529085 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:54:57 crc kubenswrapper[4991]: I1006 08:54:57.529815 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:55:27 crc kubenswrapper[4991]: I1006 08:55:27.529797 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 08:55:27 crc kubenswrapper[4991]: I1006 08:55:27.530581 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 08:55:27 crc kubenswrapper[4991]: I1006 08:55:27.530643 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 08:55:27 crc kubenswrapper[4991]: I1006 08:55:27.531582 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"} 
pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 08:55:27 crc kubenswrapper[4991]: I1006 08:55:27.531707 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" containerID="cri-o://5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb" gracePeriod=600 Oct 06 08:55:27 crc kubenswrapper[4991]: E1006 08:55:27.680460 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 08:55:28 crc kubenswrapper[4991]: I1006 08:55:28.279732 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb" exitCode=0 Oct 06 08:55:28 crc kubenswrapper[4991]: I1006 08:55:28.279803 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"} Oct 06 08:55:28 crc kubenswrapper[4991]: I1006 08:55:28.279854 4991 scope.go:117] "RemoveContainer" containerID="90b98db4325250f49a805f3086b07d67be3d9b4c9b074a9983cddc7c950dae26" Oct 06 08:55:28 crc kubenswrapper[4991]: I1006 08:55:28.280792 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb" Oct 
06 08:55:28 crc kubenswrapper[4991]: E1006 08:55:28.281168 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:55:40 crc kubenswrapper[4991]: I1006 08:55:40.244069 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:55:40 crc kubenswrapper[4991]: E1006 08:55:40.244978 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:55:53 crc kubenswrapper[4991]: I1006 08:55:53.243788 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:55:53 crc kubenswrapper[4991]: E1006 08:55:53.244973 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:56:08 crc kubenswrapper[4991]: I1006 08:56:08.243732 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:56:08 crc kubenswrapper[4991]: E1006 08:56:08.244678 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:56:19 crc kubenswrapper[4991]: I1006 08:56:19.249655 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:56:19 crc kubenswrapper[4991]: E1006 08:56:19.250848 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:56:33 crc kubenswrapper[4991]: I1006 08:56:33.244408 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:56:33 crc kubenswrapper[4991]: E1006 08:56:33.245203 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:56:48 crc kubenswrapper[4991]: I1006 08:56:48.246358 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:56:48 crc kubenswrapper[4991]: E1006 08:56:48.247151 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:56:53 crc kubenswrapper[4991]: I1006 08:56:53.808864 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-56tvw"]
Oct 06 08:56:53 crc kubenswrapper[4991]: E1006 08:56:53.810246 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerName="extract-utilities"
Oct 06 08:56:53 crc kubenswrapper[4991]: I1006 08:56:53.810274 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerName="extract-utilities"
Oct 06 08:56:53 crc kubenswrapper[4991]: E1006 08:56:53.810330 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerName="extract-content"
Oct 06 08:56:53 crc kubenswrapper[4991]: I1006 08:56:53.810347 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerName="extract-content"
Oct 06 08:56:53 crc kubenswrapper[4991]: E1006 08:56:53.810395 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerName="registry-server"
Oct 06 08:56:53 crc kubenswrapper[4991]: I1006 08:56:53.810413 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerName="registry-server"
Oct 06 08:56:53 crc kubenswrapper[4991]: I1006 08:56:53.810782 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e5f44c-d036-4810-afe5-9557e209c75a" containerName="registry-server"
Oct 06 08:56:53 crc kubenswrapper[4991]: I1006 08:56:53.814498 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:53 crc kubenswrapper[4991]: I1006 08:56:53.829469 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56tvw"]
Oct 06 08:56:53 crc kubenswrapper[4991]: I1006 08:56:53.933543 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-catalog-content\") pod \"redhat-operators-56tvw\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") " pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:53 crc kubenswrapper[4991]: I1006 08:56:53.933626 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ql9v\" (UniqueName: \"kubernetes.io/projected/d7a31134-1417-47c9-83b0-150edc8d56c0-kube-api-access-5ql9v\") pod \"redhat-operators-56tvw\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") " pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:53 crc kubenswrapper[4991]: I1006 08:56:53.933696 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-utilities\") pod \"redhat-operators-56tvw\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") " pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:54 crc kubenswrapper[4991]: I1006 08:56:54.035175 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-utilities\") pod \"redhat-operators-56tvw\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") " pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:54 crc kubenswrapper[4991]: I1006 08:56:54.035318 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-catalog-content\") pod \"redhat-operators-56tvw\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") " pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:54 crc kubenswrapper[4991]: I1006 08:56:54.035351 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ql9v\" (UniqueName: \"kubernetes.io/projected/d7a31134-1417-47c9-83b0-150edc8d56c0-kube-api-access-5ql9v\") pod \"redhat-operators-56tvw\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") " pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:54 crc kubenswrapper[4991]: I1006 08:56:54.035891 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-utilities\") pod \"redhat-operators-56tvw\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") " pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:54 crc kubenswrapper[4991]: I1006 08:56:54.036129 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-catalog-content\") pod \"redhat-operators-56tvw\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") " pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:54 crc kubenswrapper[4991]: I1006 08:56:54.055235 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ql9v\" (UniqueName: \"kubernetes.io/projected/d7a31134-1417-47c9-83b0-150edc8d56c0-kube-api-access-5ql9v\") pod \"redhat-operators-56tvw\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") " pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:54 crc kubenswrapper[4991]: I1006 08:56:54.185241 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:56:54 crc kubenswrapper[4991]: I1006 08:56:54.624686 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56tvw"]
Oct 06 08:56:54 crc kubenswrapper[4991]: W1006 08:56:54.641458 4991 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a31134_1417_47c9_83b0_150edc8d56c0.slice/crio-a6c397739ab1a86c15a105036f09d193550c6e3bc2c9d67e0b067e277c60ad5d WatchSource:0}: Error finding container a6c397739ab1a86c15a105036f09d193550c6e3bc2c9d67e0b067e277c60ad5d: Status 404 returned error can't find the container with id a6c397739ab1a86c15a105036f09d193550c6e3bc2c9d67e0b067e277c60ad5d
Oct 06 08:56:55 crc kubenswrapper[4991]: I1006 08:56:55.055778 4991 generic.go:334] "Generic (PLEG): container finished" podID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerID="aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869" exitCode=0
Oct 06 08:56:55 crc kubenswrapper[4991]: I1006 08:56:55.055876 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56tvw" event={"ID":"d7a31134-1417-47c9-83b0-150edc8d56c0","Type":"ContainerDied","Data":"aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869"}
Oct 06 08:56:55 crc kubenswrapper[4991]: I1006 08:56:55.056022 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56tvw" event={"ID":"d7a31134-1417-47c9-83b0-150edc8d56c0","Type":"ContainerStarted","Data":"a6c397739ab1a86c15a105036f09d193550c6e3bc2c9d67e0b067e277c60ad5d"}
Oct 06 08:56:55 crc kubenswrapper[4991]: I1006 08:56:55.058695 4991 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 08:56:57 crc kubenswrapper[4991]: I1006 08:56:57.075589 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56tvw" event={"ID":"d7a31134-1417-47c9-83b0-150edc8d56c0","Type":"ContainerStarted","Data":"0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a"}
Oct 06 08:56:57 crc kubenswrapper[4991]: E1006 08:56:57.303652 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a31134_1417_47c9_83b0_150edc8d56c0.slice/crio-conmon-0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a31134_1417_47c9_83b0_150edc8d56c0.slice/crio-0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a.scope\": RecentStats: unable to find data in memory cache]"
Oct 06 08:56:58 crc kubenswrapper[4991]: I1006 08:56:58.090536 4991 generic.go:334] "Generic (PLEG): container finished" podID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerID="0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a" exitCode=0
Oct 06 08:56:58 crc kubenswrapper[4991]: I1006 08:56:58.090602 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56tvw" event={"ID":"d7a31134-1417-47c9-83b0-150edc8d56c0","Type":"ContainerDied","Data":"0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a"}
Oct 06 08:56:59 crc kubenswrapper[4991]: I1006 08:56:59.103034 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56tvw" event={"ID":"d7a31134-1417-47c9-83b0-150edc8d56c0","Type":"ContainerStarted","Data":"86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e"}
Oct 06 08:56:59 crc kubenswrapper[4991]: I1006 08:56:59.141738 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-56tvw" podStartSLOduration=2.746191026 podStartE2EDuration="6.141709494s" podCreationTimestamp="2025-10-06 08:56:53 +0000 UTC" firstStartedPulling="2025-10-06 08:56:55.058122101 +0000 UTC m=+2266.795872152" lastFinishedPulling="2025-10-06 08:56:58.453640589 +0000 UTC m=+2270.191390620" observedRunningTime="2025-10-06 08:56:59.121789996 +0000 UTC m=+2270.859540047" watchObservedRunningTime="2025-10-06 08:56:59.141709494 +0000 UTC m=+2270.879459545"
Oct 06 08:57:03 crc kubenswrapper[4991]: I1006 08:57:03.243427 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:57:03 crc kubenswrapper[4991]: E1006 08:57:03.243866 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:57:04 crc kubenswrapper[4991]: I1006 08:57:04.186136 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:57:04 crc kubenswrapper[4991]: I1006 08:57:04.186610 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:57:04 crc kubenswrapper[4991]: I1006 08:57:04.265206 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:57:05 crc kubenswrapper[4991]: I1006 08:57:05.227758 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:57:05 crc kubenswrapper[4991]: I1006 08:57:05.285644 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56tvw"]
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.179322 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-56tvw" podUID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerName="registry-server" containerID="cri-o://86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e" gracePeriod=2
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.642800 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.735752 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-catalog-content\") pod \"d7a31134-1417-47c9-83b0-150edc8d56c0\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") "
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.735857 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ql9v\" (UniqueName: \"kubernetes.io/projected/d7a31134-1417-47c9-83b0-150edc8d56c0-kube-api-access-5ql9v\") pod \"d7a31134-1417-47c9-83b0-150edc8d56c0\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") "
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.735932 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-utilities\") pod \"d7a31134-1417-47c9-83b0-150edc8d56c0\" (UID: \"d7a31134-1417-47c9-83b0-150edc8d56c0\") "
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.737966 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-utilities" (OuterVolumeSpecName: "utilities") pod "d7a31134-1417-47c9-83b0-150edc8d56c0" (UID: "d7a31134-1417-47c9-83b0-150edc8d56c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.746225 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a31134-1417-47c9-83b0-150edc8d56c0-kube-api-access-5ql9v" (OuterVolumeSpecName: "kube-api-access-5ql9v") pod "d7a31134-1417-47c9-83b0-150edc8d56c0" (UID: "d7a31134-1417-47c9-83b0-150edc8d56c0"). InnerVolumeSpecName "kube-api-access-5ql9v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.821004 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7a31134-1417-47c9-83b0-150edc8d56c0" (UID: "d7a31134-1417-47c9-83b0-150edc8d56c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.840623 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.840696 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ql9v\" (UniqueName: \"kubernetes.io/projected/d7a31134-1417-47c9-83b0-150edc8d56c0-kube-api-access-5ql9v\") on node \"crc\" DevicePath \"\""
Oct 06 08:57:07 crc kubenswrapper[4991]: I1006 08:57:07.840725 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7a31134-1417-47c9-83b0-150edc8d56c0-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.190753 4991 generic.go:334] "Generic (PLEG): container finished" podID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerID="86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e" exitCode=0
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.190842 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56tvw" event={"ID":"d7a31134-1417-47c9-83b0-150edc8d56c0","Type":"ContainerDied","Data":"86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e"}
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.190933 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56tvw" event={"ID":"d7a31134-1417-47c9-83b0-150edc8d56c0","Type":"ContainerDied","Data":"a6c397739ab1a86c15a105036f09d193550c6e3bc2c9d67e0b067e277c60ad5d"}
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.190972 4991 scope.go:117] "RemoveContainer" containerID="86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e"
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.192513 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56tvw"
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.211119 4991 scope.go:117] "RemoveContainer" containerID="0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a"
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.250141 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56tvw"]
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.251671 4991 scope.go:117] "RemoveContainer" containerID="aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869"
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.257976 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-56tvw"]
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.289706 4991 scope.go:117] "RemoveContainer" containerID="86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e"
Oct 06 08:57:08 crc kubenswrapper[4991]: E1006 08:57:08.290080 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e\": container with ID starting with 86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e not found: ID does not exist" containerID="86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e"
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.290117 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e"} err="failed to get container status \"86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e\": rpc error: code = NotFound desc = could not find container \"86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e\": container with ID starting with 86bcd482fe013463f51325792e5092b69f7660dba445d832302c0907b3e7957e not found: ID does not exist"
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.290142 4991 scope.go:117] "RemoveContainer" containerID="0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a"
Oct 06 08:57:08 crc kubenswrapper[4991]: E1006 08:57:08.290717 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a\": container with ID starting with 0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a not found: ID does not exist" containerID="0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a"
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.290750 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a"} err="failed to get container status \"0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a\": rpc error: code = NotFound desc = could not find container \"0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a\": container with ID starting with 0f4cf8afb476916287467783dce7d8c30b0efa35ac75ba0b7ffed54590f4654a not found: ID does not exist"
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.290788 4991 scope.go:117] "RemoveContainer" containerID="aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869"
Oct 06 08:57:08 crc kubenswrapper[4991]: E1006 08:57:08.291083 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869\": container with ID starting with aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869 not found: ID does not exist" containerID="aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869"
Oct 06 08:57:08 crc kubenswrapper[4991]: I1006 08:57:08.291119 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869"} err="failed to get container status \"aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869\": rpc error: code = NotFound desc = could not find container \"aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869\": container with ID starting with aacd0031848d261db685da69ae9e3501bc95cce85994385cbde944bfe09d4869 not found: ID does not exist"
Oct 06 08:57:09 crc kubenswrapper[4991]: I1006 08:57:09.258119 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a31134-1417-47c9-83b0-150edc8d56c0" path="/var/lib/kubelet/pods/d7a31134-1417-47c9-83b0-150edc8d56c0/volumes"
Oct 06 08:57:17 crc kubenswrapper[4991]: I1006 08:57:17.244215 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:57:17 crc kubenswrapper[4991]: E1006 08:57:17.244910 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:57:32 crc kubenswrapper[4991]: I1006 08:57:32.243731 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:57:32 crc kubenswrapper[4991]: E1006 08:57:32.244478 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:57:44 crc kubenswrapper[4991]: I1006 08:57:44.243737 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:57:44 crc kubenswrapper[4991]: E1006 08:57:44.244758 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:57:59 crc kubenswrapper[4991]: I1006 08:57:59.257615 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:57:59 crc kubenswrapper[4991]: E1006 08:57:59.259863 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:58:10 crc kubenswrapper[4991]: I1006 08:58:10.244337 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:58:10 crc kubenswrapper[4991]: E1006 08:58:10.245377 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:58:23 crc kubenswrapper[4991]: I1006 08:58:23.245271 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:58:23 crc kubenswrapper[4991]: E1006 08:58:23.246581 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:58:37 crc kubenswrapper[4991]: I1006 08:58:37.243696 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:58:37 crc kubenswrapper[4991]: E1006 08:58:37.244428 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:58:52 crc kubenswrapper[4991]: I1006 08:58:52.244686 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:58:52 crc kubenswrapper[4991]: E1006 08:58:52.245346 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:59:04 crc kubenswrapper[4991]: I1006 08:59:04.244529 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:59:04 crc kubenswrapper[4991]: E1006 08:59:04.245172 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:59:16 crc kubenswrapper[4991]: I1006 08:59:16.244022 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:59:16 crc kubenswrapper[4991]: E1006 08:59:16.244698 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:59:27 crc kubenswrapper[4991]: I1006 08:59:27.243727 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:59:27 crc kubenswrapper[4991]: E1006 08:59:27.244461 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:59:42 crc kubenswrapper[4991]: I1006 08:59:42.244203 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:59:42 crc kubenswrapper[4991]: E1006 08:59:42.245593 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 08:59:56 crc kubenswrapper[4991]: I1006 08:59:56.244422 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb"
Oct 06 08:59:56 crc kubenswrapper[4991]: E1006 08:59:56.247078 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.164680 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"]
Oct 06 09:00:00 crc kubenswrapper[4991]: E1006 09:00:00.165655 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerName="extract-content"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.165689 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerName="extract-content"
Oct 06 09:00:00 crc kubenswrapper[4991]: E1006 09:00:00.165722 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerName="extract-utilities"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.165772 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerName="extract-utilities"
Oct 06 09:00:00 crc kubenswrapper[4991]: E1006 09:00:00.165835 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerName="registry-server"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.165856 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerName="registry-server"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.166243 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a31134-1417-47c9-83b0-150edc8d56c0" containerName="registry-server"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.167347 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.169482 4991 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.169721 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.198076 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"]
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.284213 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbgxv\" (UniqueName: \"kubernetes.io/projected/698d3347-128f-49c9-b02a-ea2006d7761e-kube-api-access-tbgxv\") pod \"collect-profiles-29329020-z5jnc\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.284329 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/698d3347-128f-49c9-b02a-ea2006d7761e-config-volume\") pod \"collect-profiles-29329020-z5jnc\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.284377 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/698d3347-128f-49c9-b02a-ea2006d7761e-secret-volume\") pod \"collect-profiles-29329020-z5jnc\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.385847 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/698d3347-128f-49c9-b02a-ea2006d7761e-config-volume\") pod \"collect-profiles-29329020-z5jnc\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.385918 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/698d3347-128f-49c9-b02a-ea2006d7761e-secret-volume\") pod \"collect-profiles-29329020-z5jnc\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.386021 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbgxv\" (UniqueName: \"kubernetes.io/projected/698d3347-128f-49c9-b02a-ea2006d7761e-kube-api-access-tbgxv\") pod \"collect-profiles-29329020-z5jnc\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.386813 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/698d3347-128f-49c9-b02a-ea2006d7761e-config-volume\") pod \"collect-profiles-29329020-z5jnc\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"
Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.391695 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName:
\"kubernetes.io/secret/698d3347-128f-49c9-b02a-ea2006d7761e-secret-volume\") pod \"collect-profiles-29329020-z5jnc\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc" Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.405019 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbgxv\" (UniqueName: \"kubernetes.io/projected/698d3347-128f-49c9-b02a-ea2006d7761e-kube-api-access-tbgxv\") pod \"collect-profiles-29329020-z5jnc\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc" Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.493391 4991 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc" Oct 06 09:00:00 crc kubenswrapper[4991]: I1006 09:00:00.891029 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc"] Oct 06 09:00:01 crc kubenswrapper[4991]: I1006 09:00:01.771829 4991 generic.go:334] "Generic (PLEG): container finished" podID="698d3347-128f-49c9-b02a-ea2006d7761e" containerID="f587762db42743e1fcaa05f530639cd76da875e860962916a5cdd603b1e50ea7" exitCode=0 Oct 06 09:00:01 crc kubenswrapper[4991]: I1006 09:00:01.771888 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc" event={"ID":"698d3347-128f-49c9-b02a-ea2006d7761e","Type":"ContainerDied","Data":"f587762db42743e1fcaa05f530639cd76da875e860962916a5cdd603b1e50ea7"} Oct 06 09:00:01 crc kubenswrapper[4991]: I1006 09:00:01.772126 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc" 
event={"ID":"698d3347-128f-49c9-b02a-ea2006d7761e","Type":"ContainerStarted","Data":"e24cd744a153d2aa30f36a54f2dcfdb687754a910455732f68a12bcc407ce962"} Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.023115 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc" Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.222100 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/698d3347-128f-49c9-b02a-ea2006d7761e-secret-volume\") pod \"698d3347-128f-49c9-b02a-ea2006d7761e\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.222171 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbgxv\" (UniqueName: \"kubernetes.io/projected/698d3347-128f-49c9-b02a-ea2006d7761e-kube-api-access-tbgxv\") pod \"698d3347-128f-49c9-b02a-ea2006d7761e\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.222368 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/698d3347-128f-49c9-b02a-ea2006d7761e-config-volume\") pod \"698d3347-128f-49c9-b02a-ea2006d7761e\" (UID: \"698d3347-128f-49c9-b02a-ea2006d7761e\") " Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.223040 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/698d3347-128f-49c9-b02a-ea2006d7761e-config-volume" (OuterVolumeSpecName: "config-volume") pod "698d3347-128f-49c9-b02a-ea2006d7761e" (UID: "698d3347-128f-49c9-b02a-ea2006d7761e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.223233 4991 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/698d3347-128f-49c9-b02a-ea2006d7761e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.227891 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/698d3347-128f-49c9-b02a-ea2006d7761e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "698d3347-128f-49c9-b02a-ea2006d7761e" (UID: "698d3347-128f-49c9-b02a-ea2006d7761e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.235018 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/698d3347-128f-49c9-b02a-ea2006d7761e-kube-api-access-tbgxv" (OuterVolumeSpecName: "kube-api-access-tbgxv") pod "698d3347-128f-49c9-b02a-ea2006d7761e" (UID: "698d3347-128f-49c9-b02a-ea2006d7761e"). InnerVolumeSpecName "kube-api-access-tbgxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.324756 4991 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/698d3347-128f-49c9-b02a-ea2006d7761e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.324996 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbgxv\" (UniqueName: \"kubernetes.io/projected/698d3347-128f-49c9-b02a-ea2006d7761e-kube-api-access-tbgxv\") on node \"crc\" DevicePath \"\"" Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.788141 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc" event={"ID":"698d3347-128f-49c9-b02a-ea2006d7761e","Type":"ContainerDied","Data":"e24cd744a153d2aa30f36a54f2dcfdb687754a910455732f68a12bcc407ce962"} Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.788177 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e24cd744a153d2aa30f36a54f2dcfdb687754a910455732f68a12bcc407ce962" Oct 06 09:00:03 crc kubenswrapper[4991]: I1006 09:00:03.788222 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329020-z5jnc" Oct 06 09:00:04 crc kubenswrapper[4991]: I1006 09:00:04.091374 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl"] Oct 06 09:00:04 crc kubenswrapper[4991]: I1006 09:00:04.098244 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29328975-9xcfl"] Oct 06 09:00:05 crc kubenswrapper[4991]: I1006 09:00:05.255360 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aab780d-af84-45fa-bc9c-b728d4e196d1" path="/var/lib/kubelet/pods/1aab780d-af84-45fa-bc9c-b728d4e196d1/volumes" Oct 06 09:00:09 crc kubenswrapper[4991]: I1006 09:00:09.249419 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb" Oct 06 09:00:09 crc kubenswrapper[4991]: E1006 09:00:09.250152 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:00:23 crc kubenswrapper[4991]: I1006 09:00:23.244401 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb" Oct 06 09:00:23 crc kubenswrapper[4991]: E1006 09:00:23.245085 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:00:31 crc kubenswrapper[4991]: I1006 09:00:31.648612 4991 scope.go:117] "RemoveContainer" containerID="ec5697ed95fa543f6e1d7395b7ab21c28a99c64c2fee9d4f14b1e4db062368d8" Oct 06 09:00:36 crc kubenswrapper[4991]: I1006 09:00:36.244240 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb" Oct 06 09:00:37 crc kubenswrapper[4991]: I1006 09:00:37.071743 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"4c7840836bb8b30722fb37ef42a6ddb91588c34148d5ca4e7091454a235364ab"} Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.146514 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6wszv"] Oct 06 09:02:57 crc kubenswrapper[4991]: E1006 09:02:57.147757 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="698d3347-128f-49c9-b02a-ea2006d7761e" containerName="collect-profiles" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.147772 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="698d3347-128f-49c9-b02a-ea2006d7761e" containerName="collect-profiles" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.147973 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="698d3347-128f-49c9-b02a-ea2006d7761e" containerName="collect-profiles" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.149388 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.169177 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6wszv"] Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.329896 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-utilities\") pod \"certified-operators-6wszv\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.329994 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-catalog-content\") pod \"certified-operators-6wszv\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.330120 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwjjz\" (UniqueName: \"kubernetes.io/projected/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-kube-api-access-mwjjz\") pod \"certified-operators-6wszv\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.431368 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwjjz\" (UniqueName: \"kubernetes.io/projected/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-kube-api-access-mwjjz\") pod \"certified-operators-6wszv\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.431770 4991 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-utilities\") pod \"certified-operators-6wszv\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.431885 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-catalog-content\") pod \"certified-operators-6wszv\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.432313 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-utilities\") pod \"certified-operators-6wszv\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.432344 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-catalog-content\") pod \"certified-operators-6wszv\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.463635 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwjjz\" (UniqueName: \"kubernetes.io/projected/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-kube-api-access-mwjjz\") pod \"certified-operators-6wszv\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.487267 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.529444 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.529503 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:02:57 crc kubenswrapper[4991]: I1006 09:02:57.982609 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6wszv"] Oct 06 09:02:58 crc kubenswrapper[4991]: I1006 09:02:58.321523 4991 generic.go:334] "Generic (PLEG): container finished" podID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerID="b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac" exitCode=0 Oct 06 09:02:58 crc kubenswrapper[4991]: I1006 09:02:58.321585 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wszv" event={"ID":"e7ce1eb1-bfe4-4493-922e-f2b8f6670096","Type":"ContainerDied","Data":"b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac"} Oct 06 09:02:58 crc kubenswrapper[4991]: I1006 09:02:58.321646 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wszv" event={"ID":"e7ce1eb1-bfe4-4493-922e-f2b8f6670096","Type":"ContainerStarted","Data":"bb6c43158273ff023e6c7a36dc9601ed64b6d98526819ee7414aa8c79ef3da40"} Oct 06 09:02:58 crc kubenswrapper[4991]: I1006 09:02:58.323364 4991 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 09:02:59 crc kubenswrapper[4991]: I1006 09:02:59.332029 4991 generic.go:334] "Generic (PLEG): container finished" podID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerID="8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53" exitCode=0 Oct 06 09:02:59 crc kubenswrapper[4991]: I1006 09:02:59.332194 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wszv" event={"ID":"e7ce1eb1-bfe4-4493-922e-f2b8f6670096","Type":"ContainerDied","Data":"8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53"} Oct 06 09:03:00 crc kubenswrapper[4991]: I1006 09:03:00.343376 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wszv" event={"ID":"e7ce1eb1-bfe4-4493-922e-f2b8f6670096","Type":"ContainerStarted","Data":"2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560"} Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.313359 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6wszv" podStartSLOduration=4.818359876 podStartE2EDuration="6.313325349s" podCreationTimestamp="2025-10-06 09:02:57 +0000 UTC" firstStartedPulling="2025-10-06 09:02:58.323024781 +0000 UTC m=+2630.060774822" lastFinishedPulling="2025-10-06 09:02:59.817990264 +0000 UTC m=+2631.555740295" observedRunningTime="2025-10-06 09:03:00.369634768 +0000 UTC m=+2632.107384789" watchObservedRunningTime="2025-10-06 09:03:03.313325349 +0000 UTC m=+2635.051075390" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.317689 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qwt6k"] Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.319930 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.331056 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwt6k"] Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.521785 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg4vg\" (UniqueName: \"kubernetes.io/projected/9a502684-358c-40cf-8dda-1f6039d4bcff-kube-api-access-cg4vg\") pod \"redhat-marketplace-qwt6k\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.521888 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-utilities\") pod \"redhat-marketplace-qwt6k\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.522327 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-catalog-content\") pod \"redhat-marketplace-qwt6k\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.623909 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg4vg\" (UniqueName: \"kubernetes.io/projected/9a502684-358c-40cf-8dda-1f6039d4bcff-kube-api-access-cg4vg\") pod \"redhat-marketplace-qwt6k\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.624235 4991 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-utilities\") pod \"redhat-marketplace-qwt6k\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.624377 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-catalog-content\") pod \"redhat-marketplace-qwt6k\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.624801 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-utilities\") pod \"redhat-marketplace-qwt6k\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.624900 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-catalog-content\") pod \"redhat-marketplace-qwt6k\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.650537 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg4vg\" (UniqueName: \"kubernetes.io/projected/9a502684-358c-40cf-8dda-1f6039d4bcff-kube-api-access-cg4vg\") pod \"redhat-marketplace-qwt6k\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:03 crc kubenswrapper[4991]: I1006 09:03:03.948640 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:04 crc kubenswrapper[4991]: I1006 09:03:04.205380 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwt6k"] Oct 06 09:03:04 crc kubenswrapper[4991]: I1006 09:03:04.375438 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwt6k" event={"ID":"9a502684-358c-40cf-8dda-1f6039d4bcff","Type":"ContainerStarted","Data":"b391101e3a2a94aa59dcf7b672b794196aaf2156ce5f2aa797a7cec907eb1c19"} Oct 06 09:03:05 crc kubenswrapper[4991]: I1006 09:03:05.388907 4991 generic.go:334] "Generic (PLEG): container finished" podID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerID="008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b" exitCode=0 Oct 06 09:03:05 crc kubenswrapper[4991]: I1006 09:03:05.389070 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwt6k" event={"ID":"9a502684-358c-40cf-8dda-1f6039d4bcff","Type":"ContainerDied","Data":"008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b"} Oct 06 09:03:06 crc kubenswrapper[4991]: I1006 09:03:06.398393 4991 generic.go:334] "Generic (PLEG): container finished" podID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerID="80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e" exitCode=0 Oct 06 09:03:06 crc kubenswrapper[4991]: I1006 09:03:06.398499 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwt6k" event={"ID":"9a502684-358c-40cf-8dda-1f6039d4bcff","Type":"ContainerDied","Data":"80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e"} Oct 06 09:03:07 crc kubenswrapper[4991]: I1006 09:03:07.410031 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwt6k" 
event={"ID":"9a502684-358c-40cf-8dda-1f6039d4bcff","Type":"ContainerStarted","Data":"b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c"} Oct 06 09:03:07 crc kubenswrapper[4991]: I1006 09:03:07.435000 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qwt6k" podStartSLOduration=2.79892245 podStartE2EDuration="4.434980144s" podCreationTimestamp="2025-10-06 09:03:03 +0000 UTC" firstStartedPulling="2025-10-06 09:03:05.392426809 +0000 UTC m=+2637.130176830" lastFinishedPulling="2025-10-06 09:03:07.028484513 +0000 UTC m=+2638.766234524" observedRunningTime="2025-10-06 09:03:07.42937138 +0000 UTC m=+2639.167121411" watchObservedRunningTime="2025-10-06 09:03:07.434980144 +0000 UTC m=+2639.172730165" Oct 06 09:03:07 crc kubenswrapper[4991]: I1006 09:03:07.488424 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:03:07 crc kubenswrapper[4991]: I1006 09:03:07.488492 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:03:07 crc kubenswrapper[4991]: I1006 09:03:07.542890 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:03:08 crc kubenswrapper[4991]: I1006 09:03:08.474493 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:03:09 crc kubenswrapper[4991]: I1006 09:03:09.711572 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6wszv"] Oct 06 09:03:10 crc kubenswrapper[4991]: I1006 09:03:10.432838 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6wszv" podUID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerName="registry-server" 
containerID="cri-o://2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560" gracePeriod=2 Oct 06 09:03:10 crc kubenswrapper[4991]: I1006 09:03:10.840444 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:03:10 crc kubenswrapper[4991]: I1006 09:03:10.956829 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-utilities\") pod \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " Oct 06 09:03:10 crc kubenswrapper[4991]: I1006 09:03:10.956913 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwjjz\" (UniqueName: \"kubernetes.io/projected/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-kube-api-access-mwjjz\") pod \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " Oct 06 09:03:10 crc kubenswrapper[4991]: I1006 09:03:10.957245 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-catalog-content\") pod \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\" (UID: \"e7ce1eb1-bfe4-4493-922e-f2b8f6670096\") " Oct 06 09:03:10 crc kubenswrapper[4991]: I1006 09:03:10.958075 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-utilities" (OuterVolumeSpecName: "utilities") pod "e7ce1eb1-bfe4-4493-922e-f2b8f6670096" (UID: "e7ce1eb1-bfe4-4493-922e-f2b8f6670096"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:10 crc kubenswrapper[4991]: I1006 09:03:10.966363 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-kube-api-access-mwjjz" (OuterVolumeSpecName: "kube-api-access-mwjjz") pod "e7ce1eb1-bfe4-4493-922e-f2b8f6670096" (UID: "e7ce1eb1-bfe4-4493-922e-f2b8f6670096"). InnerVolumeSpecName "kube-api-access-mwjjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.018392 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7ce1eb1-bfe4-4493-922e-f2b8f6670096" (UID: "e7ce1eb1-bfe4-4493-922e-f2b8f6670096"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.059642 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.059698 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.059710 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwjjz\" (UniqueName: \"kubernetes.io/projected/e7ce1eb1-bfe4-4493-922e-f2b8f6670096-kube-api-access-mwjjz\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.441829 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6wszv" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.441861 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wszv" event={"ID":"e7ce1eb1-bfe4-4493-922e-f2b8f6670096","Type":"ContainerDied","Data":"2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560"} Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.442217 4991 scope.go:117] "RemoveContainer" containerID="2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.441717 4991 generic.go:334] "Generic (PLEG): container finished" podID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerID="2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560" exitCode=0 Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.442709 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wszv" event={"ID":"e7ce1eb1-bfe4-4493-922e-f2b8f6670096","Type":"ContainerDied","Data":"bb6c43158273ff023e6c7a36dc9601ed64b6d98526819ee7414aa8c79ef3da40"} Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.460653 4991 scope.go:117] "RemoveContainer" containerID="8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.470568 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6wszv"] Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.483494 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6wszv"] Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.493766 4991 scope.go:117] "RemoveContainer" containerID="b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.540236 4991 scope.go:117] "RemoveContainer" 
containerID="2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560" Oct 06 09:03:11 crc kubenswrapper[4991]: E1006 09:03:11.540760 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560\": container with ID starting with 2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560 not found: ID does not exist" containerID="2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.540805 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560"} err="failed to get container status \"2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560\": rpc error: code = NotFound desc = could not find container \"2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560\": container with ID starting with 2ccfbbddf5936827e37662e8ab726cb0fc635189c1544cf36cc22add9c66c560 not found: ID does not exist" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.540834 4991 scope.go:117] "RemoveContainer" containerID="8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53" Oct 06 09:03:11 crc kubenswrapper[4991]: E1006 09:03:11.541332 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53\": container with ID starting with 8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53 not found: ID does not exist" containerID="8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.541363 4991 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53"} err="failed to get container status \"8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53\": rpc error: code = NotFound desc = could not find container \"8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53\": container with ID starting with 8cf14467f6673fff904c8c1158d96c0e341552e94590ccc31575abfcbbe00d53 not found: ID does not exist" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.541384 4991 scope.go:117] "RemoveContainer" containerID="b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac" Oct 06 09:03:11 crc kubenswrapper[4991]: E1006 09:03:11.541731 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac\": container with ID starting with b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac not found: ID does not exist" containerID="b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac" Oct 06 09:03:11 crc kubenswrapper[4991]: I1006 09:03:11.541755 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac"} err="failed to get container status \"b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac\": rpc error: code = NotFound desc = could not find container \"b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac\": container with ID starting with b8b69ef62877de9a493a2720f62f97f2afd2a4f0394d77ca07707857d3029cac not found: ID does not exist" Oct 06 09:03:13 crc kubenswrapper[4991]: I1006 09:03:13.255428 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" path="/var/lib/kubelet/pods/e7ce1eb1-bfe4-4493-922e-f2b8f6670096/volumes" Oct 06 09:03:13 crc kubenswrapper[4991]: I1006 
09:03:13.948896 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:13 crc kubenswrapper[4991]: I1006 09:03:13.949633 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:13 crc kubenswrapper[4991]: I1006 09:03:13.997118 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:14 crc kubenswrapper[4991]: I1006 09:03:14.514229 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:15 crc kubenswrapper[4991]: I1006 09:03:15.102285 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwt6k"] Oct 06 09:03:16 crc kubenswrapper[4991]: I1006 09:03:16.482439 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qwt6k" podUID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerName="registry-server" containerID="cri-o://b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c" gracePeriod=2 Oct 06 09:03:16 crc kubenswrapper[4991]: I1006 09:03:16.919925 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.041429 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-utilities\") pod \"9a502684-358c-40cf-8dda-1f6039d4bcff\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.041553 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-catalog-content\") pod \"9a502684-358c-40cf-8dda-1f6039d4bcff\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.041652 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg4vg\" (UniqueName: \"kubernetes.io/projected/9a502684-358c-40cf-8dda-1f6039d4bcff-kube-api-access-cg4vg\") pod \"9a502684-358c-40cf-8dda-1f6039d4bcff\" (UID: \"9a502684-358c-40cf-8dda-1f6039d4bcff\") " Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.042886 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-utilities" (OuterVolumeSpecName: "utilities") pod "9a502684-358c-40cf-8dda-1f6039d4bcff" (UID: "9a502684-358c-40cf-8dda-1f6039d4bcff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.049558 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a502684-358c-40cf-8dda-1f6039d4bcff-kube-api-access-cg4vg" (OuterVolumeSpecName: "kube-api-access-cg4vg") pod "9a502684-358c-40cf-8dda-1f6039d4bcff" (UID: "9a502684-358c-40cf-8dda-1f6039d4bcff"). InnerVolumeSpecName "kube-api-access-cg4vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.055059 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a502684-358c-40cf-8dda-1f6039d4bcff" (UID: "9a502684-358c-40cf-8dda-1f6039d4bcff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.143876 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.143912 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a502684-358c-40cf-8dda-1f6039d4bcff-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.143925 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg4vg\" (UniqueName: \"kubernetes.io/projected/9a502684-358c-40cf-8dda-1f6039d4bcff-kube-api-access-cg4vg\") on node \"crc\" DevicePath \"\"" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.492221 4991 generic.go:334] "Generic (PLEG): container finished" podID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerID="b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c" exitCode=0 Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.492287 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qwt6k" event={"ID":"9a502684-358c-40cf-8dda-1f6039d4bcff","Type":"ContainerDied","Data":"b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c"} Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.492355 4991 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-qwt6k" event={"ID":"9a502684-358c-40cf-8dda-1f6039d4bcff","Type":"ContainerDied","Data":"b391101e3a2a94aa59dcf7b672b794196aaf2156ce5f2aa797a7cec907eb1c19"} Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.492384 4991 scope.go:117] "RemoveContainer" containerID="b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.492579 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qwt6k" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.518012 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwt6k"] Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.522524 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qwt6k"] Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.529321 4991 scope.go:117] "RemoveContainer" containerID="80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.547537 4991 scope.go:117] "RemoveContainer" containerID="008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.580722 4991 scope.go:117] "RemoveContainer" containerID="b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c" Oct 06 09:03:17 crc kubenswrapper[4991]: E1006 09:03:17.581658 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c\": container with ID starting with b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c not found: ID does not exist" containerID="b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.581713 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c"} err="failed to get container status \"b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c\": rpc error: code = NotFound desc = could not find container \"b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c\": container with ID starting with b9a265c199545375e7479114d1c4e97f9cf30064f70ca5d957ee71df3c0fb41c not found: ID does not exist" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.581751 4991 scope.go:117] "RemoveContainer" containerID="80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e" Oct 06 09:03:17 crc kubenswrapper[4991]: E1006 09:03:17.582281 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e\": container with ID starting with 80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e not found: ID does not exist" containerID="80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.582334 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e"} err="failed to get container status \"80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e\": rpc error: code = NotFound desc = could not find container \"80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e\": container with ID starting with 80a8f6139dac9b2d43ec4914101b9f81cb2e99a0aab14e0fb28ef57bf4dd015e not found: ID does not exist" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.582358 4991 scope.go:117] "RemoveContainer" containerID="008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b" Oct 06 09:03:17 crc kubenswrapper[4991]: E1006 
09:03:17.582941 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b\": container with ID starting with 008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b not found: ID does not exist" containerID="008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b" Oct 06 09:03:17 crc kubenswrapper[4991]: I1006 09:03:17.582974 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b"} err="failed to get container status \"008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b\": rpc error: code = NotFound desc = could not find container \"008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b\": container with ID starting with 008faa630ac6c7dc050f77280beb723f60d95b9a9ada39c6ea9a319c7c00835b not found: ID does not exist" Oct 06 09:03:19 crc kubenswrapper[4991]: I1006 09:03:19.258537 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a502684-358c-40cf-8dda-1f6039d4bcff" path="/var/lib/kubelet/pods/9a502684-358c-40cf-8dda-1f6039d4bcff/volumes" Oct 06 09:03:27 crc kubenswrapper[4991]: I1006 09:03:27.529024 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:03:27 crc kubenswrapper[4991]: I1006 09:03:27.529955 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 06 09:03:57 crc kubenswrapper[4991]: I1006 09:03:57.529415 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:03:57 crc kubenswrapper[4991]: I1006 09:03:57.530321 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:03:57 crc kubenswrapper[4991]: I1006 09:03:57.530386 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 09:03:57 crc kubenswrapper[4991]: I1006 09:03:57.531597 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c7840836bb8b30722fb37ef42a6ddb91588c34148d5ca4e7091454a235364ab"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:03:57 crc kubenswrapper[4991]: I1006 09:03:57.531677 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" containerID="cri-o://4c7840836bb8b30722fb37ef42a6ddb91588c34148d5ca4e7091454a235364ab" gracePeriod=600 Oct 06 09:03:57 crc kubenswrapper[4991]: I1006 09:03:57.828876 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" 
containerID="4c7840836bb8b30722fb37ef42a6ddb91588c34148d5ca4e7091454a235364ab" exitCode=0 Oct 06 09:03:57 crc kubenswrapper[4991]: I1006 09:03:57.829006 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"4c7840836bb8b30722fb37ef42a6ddb91588c34148d5ca4e7091454a235364ab"} Oct 06 09:03:57 crc kubenswrapper[4991]: I1006 09:03:57.830596 4991 scope.go:117] "RemoveContainer" containerID="5525bdc2f16a2f8896fed67fb0ade5ec104951cf036ffc506efe33c110b8fdcb" Oct 06 09:03:58 crc kubenswrapper[4991]: I1006 09:03:58.839939 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerStarted","Data":"6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c"} Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.673848 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mr4dh"] Oct 06 09:06:08 crc kubenswrapper[4991]: E1006 09:06:08.674724 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerName="extract-utilities" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.674738 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerName="extract-utilities" Oct 06 09:06:08 crc kubenswrapper[4991]: E1006 09:06:08.674752 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerName="registry-server" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.674758 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerName="registry-server" Oct 06 09:06:08 crc kubenswrapper[4991]: E1006 09:06:08.674770 4991 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerName="extract-content" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.674776 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerName="extract-content" Oct 06 09:06:08 crc kubenswrapper[4991]: E1006 09:06:08.674790 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerName="extract-utilities" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.674795 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerName="extract-utilities" Oct 06 09:06:08 crc kubenswrapper[4991]: E1006 09:06:08.674803 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerName="extract-content" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.674809 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerName="extract-content" Oct 06 09:06:08 crc kubenswrapper[4991]: E1006 09:06:08.674825 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerName="registry-server" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.674830 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerName="registry-server" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.674962 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a502684-358c-40cf-8dda-1f6039d4bcff" containerName="registry-server" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.674980 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ce1eb1-bfe4-4493-922e-f2b8f6670096" containerName="registry-server" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.676062 4991 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.699383 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mr4dh"] Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.826982 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-catalog-content\") pod \"community-operators-mr4dh\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.827341 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv8v5\" (UniqueName: \"kubernetes.io/projected/80c1a6aa-770b-4250-8d43-cd951f73a1c7-kube-api-access-rv8v5\") pod \"community-operators-mr4dh\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.827505 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-utilities\") pod \"community-operators-mr4dh\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.929402 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv8v5\" (UniqueName: \"kubernetes.io/projected/80c1a6aa-770b-4250-8d43-cd951f73a1c7-kube-api-access-rv8v5\") pod \"community-operators-mr4dh\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 
09:06:08.929489 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-utilities\") pod \"community-operators-mr4dh\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.929524 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-catalog-content\") pod \"community-operators-mr4dh\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.930173 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-catalog-content\") pod \"community-operators-mr4dh\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.930459 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-utilities\") pod \"community-operators-mr4dh\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:08 crc kubenswrapper[4991]: I1006 09:06:08.953402 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv8v5\" (UniqueName: \"kubernetes.io/projected/80c1a6aa-770b-4250-8d43-cd951f73a1c7-kube-api-access-rv8v5\") pod \"community-operators-mr4dh\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:09 crc kubenswrapper[4991]: I1006 09:06:09.016666 4991 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:09 crc kubenswrapper[4991]: I1006 09:06:09.493586 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mr4dh"] Oct 06 09:06:10 crc kubenswrapper[4991]: I1006 09:06:10.029726 4991 generic.go:334] "Generic (PLEG): container finished" podID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" containerID="daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6" exitCode=0 Oct 06 09:06:10 crc kubenswrapper[4991]: I1006 09:06:10.029797 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr4dh" event={"ID":"80c1a6aa-770b-4250-8d43-cd951f73a1c7","Type":"ContainerDied","Data":"daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6"} Oct 06 09:06:10 crc kubenswrapper[4991]: I1006 09:06:10.030996 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr4dh" event={"ID":"80c1a6aa-770b-4250-8d43-cd951f73a1c7","Type":"ContainerStarted","Data":"772bcab2bf2a2ea65ad267158410e72a3307e951b2fd347a0650f9cde69d527c"} Oct 06 09:06:11 crc kubenswrapper[4991]: I1006 09:06:11.041180 4991 generic.go:334] "Generic (PLEG): container finished" podID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" containerID="3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55" exitCode=0 Oct 06 09:06:11 crc kubenswrapper[4991]: I1006 09:06:11.041312 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr4dh" event={"ID":"80c1a6aa-770b-4250-8d43-cd951f73a1c7","Type":"ContainerDied","Data":"3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55"} Oct 06 09:06:12 crc kubenswrapper[4991]: I1006 09:06:12.068785 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr4dh" 
event={"ID":"80c1a6aa-770b-4250-8d43-cd951f73a1c7","Type":"ContainerStarted","Data":"0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c"} Oct 06 09:06:12 crc kubenswrapper[4991]: I1006 09:06:12.099635 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mr4dh" podStartSLOduration=2.697204869 podStartE2EDuration="4.099614405s" podCreationTimestamp="2025-10-06 09:06:08 +0000 UTC" firstStartedPulling="2025-10-06 09:06:10.032383551 +0000 UTC m=+2821.770133612" lastFinishedPulling="2025-10-06 09:06:11.434793077 +0000 UTC m=+2823.172543148" observedRunningTime="2025-10-06 09:06:12.094050291 +0000 UTC m=+2823.831800332" watchObservedRunningTime="2025-10-06 09:06:12.099614405 +0000 UTC m=+2823.837364426" Oct 06 09:06:19 crc kubenswrapper[4991]: I1006 09:06:19.016814 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:19 crc kubenswrapper[4991]: I1006 09:06:19.017468 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:19 crc kubenswrapper[4991]: I1006 09:06:19.084674 4991 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:19 crc kubenswrapper[4991]: I1006 09:06:19.185510 4991 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:19 crc kubenswrapper[4991]: I1006 09:06:19.330547 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mr4dh"] Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.148949 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mr4dh" podUID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" containerName="registry-server" 
containerID="cri-o://0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c" gracePeriod=2 Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.617225 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.748287 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-utilities\") pod \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.748500 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-catalog-content\") pod \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.748763 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv8v5\" (UniqueName: \"kubernetes.io/projected/80c1a6aa-770b-4250-8d43-cd951f73a1c7-kube-api-access-rv8v5\") pod \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\" (UID: \"80c1a6aa-770b-4250-8d43-cd951f73a1c7\") " Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.752147 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-utilities" (OuterVolumeSpecName: "utilities") pod "80c1a6aa-770b-4250-8d43-cd951f73a1c7" (UID: "80c1a6aa-770b-4250-8d43-cd951f73a1c7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.757949 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c1a6aa-770b-4250-8d43-cd951f73a1c7-kube-api-access-rv8v5" (OuterVolumeSpecName: "kube-api-access-rv8v5") pod "80c1a6aa-770b-4250-8d43-cd951f73a1c7" (UID: "80c1a6aa-770b-4250-8d43-cd951f73a1c7"). InnerVolumeSpecName "kube-api-access-rv8v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.845728 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80c1a6aa-770b-4250-8d43-cd951f73a1c7" (UID: "80c1a6aa-770b-4250-8d43-cd951f73a1c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.850706 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv8v5\" (UniqueName: \"kubernetes.io/projected/80c1a6aa-770b-4250-8d43-cd951f73a1c7-kube-api-access-rv8v5\") on node \"crc\" DevicePath \"\"" Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.851698 4991 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 09:06:21 crc kubenswrapper[4991]: I1006 09:06:21.851753 4991 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80c1a6aa-770b-4250-8d43-cd951f73a1c7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.163032 4991 generic.go:334] "Generic (PLEG): container finished" podID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" 
containerID="0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c" exitCode=0 Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.163109 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr4dh" event={"ID":"80c1a6aa-770b-4250-8d43-cd951f73a1c7","Type":"ContainerDied","Data":"0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c"} Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.163154 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr4dh" event={"ID":"80c1a6aa-770b-4250-8d43-cd951f73a1c7","Type":"ContainerDied","Data":"772bcab2bf2a2ea65ad267158410e72a3307e951b2fd347a0650f9cde69d527c"} Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.163186 4991 scope.go:117] "RemoveContainer" containerID="0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c" Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.163179 4991 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mr4dh" Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.195432 4991 scope.go:117] "RemoveContainer" containerID="3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55" Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.215285 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mr4dh"] Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.219951 4991 scope.go:117] "RemoveContainer" containerID="daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6" Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.222699 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mr4dh"] Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.265130 4991 scope.go:117] "RemoveContainer" containerID="0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c" Oct 06 09:06:22 crc kubenswrapper[4991]: E1006 09:06:22.265843 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c\": container with ID starting with 0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c not found: ID does not exist" containerID="0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c" Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.265902 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c"} err="failed to get container status \"0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c\": rpc error: code = NotFound desc = could not find container \"0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c\": container with ID starting with 0b1c97927015ca1c09d1f8b9b2f6809c47f3035b5aa50d59eb30bdd0e926c89c not 
found: ID does not exist" Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.265935 4991 scope.go:117] "RemoveContainer" containerID="3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55" Oct 06 09:06:22 crc kubenswrapper[4991]: E1006 09:06:22.266761 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55\": container with ID starting with 3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55 not found: ID does not exist" containerID="3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55" Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.266809 4991 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55"} err="failed to get container status \"3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55\": rpc error: code = NotFound desc = could not find container \"3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55\": container with ID starting with 3045547e4a3eb702b5d3fe301c1784b6cdfec591d768d791a9c66e945bc89d55 not found: ID does not exist" Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.266835 4991 scope.go:117] "RemoveContainer" containerID="daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6" Oct 06 09:06:22 crc kubenswrapper[4991]: E1006 09:06:22.267527 4991 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6\": container with ID starting with daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6 not found: ID does not exist" containerID="daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6" Oct 06 09:06:22 crc kubenswrapper[4991]: I1006 09:06:22.267574 4991 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6"} err="failed to get container status \"daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6\": rpc error: code = NotFound desc = could not find container \"daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6\": container with ID starting with daf2df4b699c81c2d066bb876203fb00def5b568f6ce224c2315bddaad88b0a6 not found: ID does not exist" Oct 06 09:06:23 crc kubenswrapper[4991]: I1006 09:06:23.260427 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" path="/var/lib/kubelet/pods/80c1a6aa-770b-4250-8d43-cd951f73a1c7/volumes" Oct 06 09:06:27 crc kubenswrapper[4991]: I1006 09:06:27.529880 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:06:27 crc kubenswrapper[4991]: I1006 09:06:27.530480 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:06:57 crc kubenswrapper[4991]: I1006 09:06:57.528815 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:06:57 crc kubenswrapper[4991]: I1006 09:06:57.529673 4991 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:07:27 crc kubenswrapper[4991]: I1006 09:07:27.529250 4991 patch_prober.go:28] interesting pod/machine-config-daemon-wpb6m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 09:07:27 crc kubenswrapper[4991]: I1006 09:07:27.529975 4991 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 09:07:27 crc kubenswrapper[4991]: I1006 09:07:27.530052 4991 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" Oct 06 09:07:27 crc kubenswrapper[4991]: I1006 09:07:27.531070 4991 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c"} pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 09:07:27 crc kubenswrapper[4991]: I1006 09:07:27.531184 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerName="machine-config-daemon" 
containerID="cri-o://6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" gracePeriod=600 Oct 06 09:07:27 crc kubenswrapper[4991]: E1006 09:07:27.667534 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:07:27 crc kubenswrapper[4991]: I1006 09:07:27.791226 4991 generic.go:334] "Generic (PLEG): container finished" podID="65471d7d-65b6-49ce-90be-171db9b3cb42" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" exitCode=0 Oct 06 09:07:27 crc kubenswrapper[4991]: I1006 09:07:27.791284 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" event={"ID":"65471d7d-65b6-49ce-90be-171db9b3cb42","Type":"ContainerDied","Data":"6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c"} Oct 06 09:07:27 crc kubenswrapper[4991]: I1006 09:07:27.791348 4991 scope.go:117] "RemoveContainer" containerID="4c7840836bb8b30722fb37ef42a6ddb91588c34148d5ca4e7091454a235364ab" Oct 06 09:07:27 crc kubenswrapper[4991]: I1006 09:07:27.793384 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:07:27 crc kubenswrapper[4991]: E1006 09:07:27.793941 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" 
podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.051347 4991 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6tscl/must-gather-cfpzb"] Oct 06 09:07:35 crc kubenswrapper[4991]: E1006 09:07:35.052234 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" containerName="extract-utilities" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.052253 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" containerName="extract-utilities" Oct 06 09:07:35 crc kubenswrapper[4991]: E1006 09:07:35.052270 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" containerName="extract-content" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.052278 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" containerName="extract-content" Oct 06 09:07:35 crc kubenswrapper[4991]: E1006 09:07:35.052335 4991 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" containerName="registry-server" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.052344 4991 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" containerName="registry-server" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.052526 4991 memory_manager.go:354] "RemoveStaleState removing state" podUID="80c1a6aa-770b-4250-8d43-cd951f73a1c7" containerName="registry-server" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.053465 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6tscl/must-gather-cfpzb" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.055260 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6tscl"/"kube-root-ca.crt" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.055554 4991 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6tscl"/"openshift-service-ca.crt" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.068110 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6tscl/must-gather-cfpzb"] Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.190533 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3acfddb3-dfc8-47ea-b46b-0849059f8247-must-gather-output\") pod \"must-gather-cfpzb\" (UID: \"3acfddb3-dfc8-47ea-b46b-0849059f8247\") " pod="openshift-must-gather-6tscl/must-gather-cfpzb" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.190610 4991 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5jls\" (UniqueName: \"kubernetes.io/projected/3acfddb3-dfc8-47ea-b46b-0849059f8247-kube-api-access-q5jls\") pod \"must-gather-cfpzb\" (UID: \"3acfddb3-dfc8-47ea-b46b-0849059f8247\") " pod="openshift-must-gather-6tscl/must-gather-cfpzb" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.292018 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3acfddb3-dfc8-47ea-b46b-0849059f8247-must-gather-output\") pod \"must-gather-cfpzb\" (UID: \"3acfddb3-dfc8-47ea-b46b-0849059f8247\") " pod="openshift-must-gather-6tscl/must-gather-cfpzb" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.292086 4991 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q5jls\" (UniqueName: \"kubernetes.io/projected/3acfddb3-dfc8-47ea-b46b-0849059f8247-kube-api-access-q5jls\") pod \"must-gather-cfpzb\" (UID: \"3acfddb3-dfc8-47ea-b46b-0849059f8247\") " pod="openshift-must-gather-6tscl/must-gather-cfpzb" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.292590 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3acfddb3-dfc8-47ea-b46b-0849059f8247-must-gather-output\") pod \"must-gather-cfpzb\" (UID: \"3acfddb3-dfc8-47ea-b46b-0849059f8247\") " pod="openshift-must-gather-6tscl/must-gather-cfpzb" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.313122 4991 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5jls\" (UniqueName: \"kubernetes.io/projected/3acfddb3-dfc8-47ea-b46b-0849059f8247-kube-api-access-q5jls\") pod \"must-gather-cfpzb\" (UID: \"3acfddb3-dfc8-47ea-b46b-0849059f8247\") " pod="openshift-must-gather-6tscl/must-gather-cfpzb" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.374508 4991 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6tscl/must-gather-cfpzb" Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.823879 4991 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6tscl/must-gather-cfpzb"] Oct 06 09:07:35 crc kubenswrapper[4991]: I1006 09:07:35.870737 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6tscl/must-gather-cfpzb" event={"ID":"3acfddb3-dfc8-47ea-b46b-0849059f8247","Type":"ContainerStarted","Data":"1407182e61000d14cad69e059e8acd6c0a3ce5c1eedf92cefd5effcf3426a47c"} Oct 06 09:07:40 crc kubenswrapper[4991]: I1006 09:07:40.905781 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6tscl/must-gather-cfpzb" event={"ID":"3acfddb3-dfc8-47ea-b46b-0849059f8247","Type":"ContainerStarted","Data":"0485e5cb0a1961a784829eec24f985ce959684665333bfef581e202762439697"} Oct 06 09:07:40 crc kubenswrapper[4991]: I1006 09:07:40.906311 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6tscl/must-gather-cfpzb" event={"ID":"3acfddb3-dfc8-47ea-b46b-0849059f8247","Type":"ContainerStarted","Data":"e32cb9b28bf433494ddca48e1b60b93376ea05604a9f7b95305917ad829c10b9"} Oct 06 09:07:40 crc kubenswrapper[4991]: I1006 09:07:40.942321 4991 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6tscl/must-gather-cfpzb" podStartSLOduration=1.906748866 podStartE2EDuration="5.942289528s" podCreationTimestamp="2025-10-06 09:07:35 +0000 UTC" firstStartedPulling="2025-10-06 09:07:35.833819532 +0000 UTC m=+2907.571569553" lastFinishedPulling="2025-10-06 09:07:39.869360184 +0000 UTC m=+2911.607110215" observedRunningTime="2025-10-06 09:07:40.928314981 +0000 UTC m=+2912.666065022" watchObservedRunningTime="2025-10-06 09:07:40.942289528 +0000 UTC m=+2912.680039549" Oct 06 09:07:42 crc kubenswrapper[4991]: I1006 09:07:42.243957 4991 scope.go:117] "RemoveContainer" 
containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:07:42 crc kubenswrapper[4991]: E1006 09:07:42.244524 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:07:53 crc kubenswrapper[4991]: I1006 09:07:53.244401 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:07:53 crc kubenswrapper[4991]: E1006 09:07:53.245393 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:08:04 crc kubenswrapper[4991]: I1006 09:08:04.243599 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:08:04 crc kubenswrapper[4991]: E1006 09:08:04.244349 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:08:15 crc kubenswrapper[4991]: I1006 09:08:15.244371 4991 scope.go:117] 
"RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:08:15 crc kubenswrapper[4991]: E1006 09:08:15.245497 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:08:26 crc kubenswrapper[4991]: I1006 09:08:26.244498 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:08:26 crc kubenswrapper[4991]: E1006 09:08:26.245568 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:08:31 crc kubenswrapper[4991]: I1006 09:08:31.609862 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh_55afad66-4c00-4f7b-bb4a-7cc0eb6c6742/util/0.log" Oct 06 09:08:31 crc kubenswrapper[4991]: I1006 09:08:31.784908 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh_55afad66-4c00-4f7b-bb4a-7cc0eb6c6742/pull/0.log" Oct 06 09:08:31 crc kubenswrapper[4991]: I1006 09:08:31.803435 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh_55afad66-4c00-4f7b-bb4a-7cc0eb6c6742/util/0.log" Oct 06 09:08:31 crc kubenswrapper[4991]: I1006 09:08:31.805436 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh_55afad66-4c00-4f7b-bb4a-7cc0eb6c6742/pull/0.log" Oct 06 09:08:31 crc kubenswrapper[4991]: I1006 09:08:31.996411 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh_55afad66-4c00-4f7b-bb4a-7cc0eb6c6742/util/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.004583 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh_55afad66-4c00-4f7b-bb4a-7cc0eb6c6742/extract/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.036587 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_40bdb12e1a512471e97134e9717163b7b65c7d38f4e9245ae87ed61b65jzclh_55afad66-4c00-4f7b-bb4a-7cc0eb6c6742/pull/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.201064 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-jlsb9_605ba4cf-892d-451c-af8d-a6863c67898d/kube-rbac-proxy/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.238953 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5f7c849b98-jlsb9_605ba4cf-892d-451c-af8d-a6863c67898d/manager/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.278491 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-wrsh9_a85c6bdb-d40d-428f-8f1e-0001b8dd34f7/kube-rbac-proxy/0.log" Oct 06 09:08:32 crc 
kubenswrapper[4991]: I1006 09:08:32.410228 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-wrsh9_a85c6bdb-d40d-428f-8f1e-0001b8dd34f7/manager/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.427605 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-nsr9g_ea26b29a-2a8d-4f43-8471-8f875d278b8f/kube-rbac-proxy/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.468081 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-nsr9g_ea26b29a-2a8d-4f43-8471-8f875d278b8f/manager/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.609942 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-25vl9_a69a6896-7855-4b15-b0b9-e26f87ad2864/kube-rbac-proxy/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.700972 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5568b5d68-25vl9_a69a6896-7855-4b15-b0b9-e26f87ad2864/manager/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.775881 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-94qvn_2fcb483a-426d-49ef-9126-c5c8e4ff3a17/kube-rbac-proxy/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.864560 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-8f58bc9db-94qvn_2fcb483a-426d-49ef-9126-c5c8e4ff3a17/manager/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.894050 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-dnp2m_9315f646-15d5-4638-8f93-2b7ee013a548/kube-rbac-proxy/0.log" Oct 06 09:08:32 crc kubenswrapper[4991]: I1006 09:08:32.991400 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54876c876f-dnp2m_9315f646-15d5-4638-8f93-2b7ee013a548/manager/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.096838 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-phjxk_25827c7f-a146-4c7c-900f-670c747d6a15/kube-rbac-proxy/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.176535 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-phjxk_25827c7f-a146-4c7c-900f-670c747d6a15/manager/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.243652 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-h2655_38df1b74-dd97-43b9-a172-93ca631f8467/kube-rbac-proxy/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.305481 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-699b87f775-h2655_38df1b74-dd97-43b9-a172-93ca631f8467/manager/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.409023 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-v7mb5_54e0400f-2429-4520-9d49-82915611ff23/kube-rbac-proxy/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.501313 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-655d88ccb9-v7mb5_54e0400f-2429-4520-9d49-82915611ff23/manager/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.563439 
4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-klmdp_65e5f035-09a2-492c-8474-9b6441c345a3/kube-rbac-proxy/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.622428 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-klmdp_65e5f035-09a2-492c-8474-9b6441c345a3/manager/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.740731 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z_db7afb25-5117-448c-aa10-aaad9f53b2d2/kube-rbac-proxy/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.768462 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-t5w8z_db7afb25-5117-448c-aa10-aaad9f53b2d2/manager/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.900566 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-wzcnb_d187cd97-0019-488f-9f51-339b4ee5c699/kube-rbac-proxy/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.935558 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-wzcnb_d187cd97-0019-488f-9f51-339b4ee5c699/manager/0.log" Oct 06 09:08:33 crc kubenswrapper[4991]: I1006 09:08:33.996893 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-qrvjh_ead03587-67bc-428d-a356-b00483d82715/kube-rbac-proxy/0.log" Oct 06 09:08:34 crc kubenswrapper[4991]: I1006 09:08:34.101815 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-vf4gj_4f7769cb-2786-4ff1-8991-b8d073a47967/kube-rbac-proxy/0.log" Oct 06 
09:08:34 crc kubenswrapper[4991]: I1006 09:08:34.183986 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-qrvjh_ead03587-67bc-428d-a356-b00483d82715/manager/0.log" Oct 06 09:08:34 crc kubenswrapper[4991]: I1006 09:08:34.187034 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-vf4gj_4f7769cb-2786-4ff1-8991-b8d073a47967/manager/0.log" Oct 06 09:08:34 crc kubenswrapper[4991]: I1006 09:08:34.372727 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg_4dd0865e-4068-4db1-a2ae-a854d69d0367/manager/0.log" Oct 06 09:08:34 crc kubenswrapper[4991]: I1006 09:08:34.395957 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c6qvcg_4dd0865e-4068-4db1-a2ae-a854d69d0367/kube-rbac-proxy/0.log" Oct 06 09:08:34 crc kubenswrapper[4991]: I1006 09:08:34.492509 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7c985df74c-bwr96_c317e377-2640-4117-a225-bb65849d42d0/kube-rbac-proxy/0.log" Oct 06 09:08:34 crc kubenswrapper[4991]: I1006 09:08:34.565876 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-856746bff4-lbshg_efe6525f-b400-4474-b6c6-d26c4ab8a43c/kube-rbac-proxy/0.log" Oct 06 09:08:34 crc kubenswrapper[4991]: I1006 09:08:34.795827 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-856746bff4-lbshg_efe6525f-b400-4474-b6c6-d26c4ab8a43c/operator/0.log" Oct 06 09:08:34 crc kubenswrapper[4991]: I1006 09:08:34.886652 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-tzrrq_3fa73b60-5381-42ef-be66-f254fb2b80a1/registry-server/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.008784 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-29dxq_3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60/kube-rbac-proxy/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.188838 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-mqlt4_1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f/kube-rbac-proxy/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.214870 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-579449c7d5-29dxq_3b87bd2d-2bf2-47a4-beba-7fa9e33b0a60/manager/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.315158 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7c985df74c-bwr96_c317e377-2640-4117-a225-bb65849d42d0/manager/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.386745 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-mqlt4_1d68b4d1-c1c2-4fe4-a20d-794a3fab3c7f/manager/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.478365 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-zfs6k_cd0a6c35-bd04-4ce8-8b61-94fe7ae169b6/operator/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.478958 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-tq9r2_94e75499-3011-4f29-9c5c-9a5cbea7d10f/kube-rbac-proxy/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.584827 4991 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-tq9r2_94e75499-3011-4f29-9c5c-9a5cbea7d10f/manager/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.675931 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-nhl5c_a048641a-f1d7-4abc-a26f-1537cad412ec/kube-rbac-proxy/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.732744 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-nhl5c_a048641a-f1d7-4abc-a26f-1537cad412ec/manager/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.783391 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-q5fql_b503d08d-eaa0-4987-93e1-099a4ea00450/kube-rbac-proxy/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.859950 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-q5fql_b503d08d-eaa0-4987-93e1-099a4ea00450/manager/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.939555 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-pghn2_04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f/kube-rbac-proxy/0.log" Oct 06 09:08:35 crc kubenswrapper[4991]: I1006 09:08:35.950754 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-pghn2_04fc08a6-3ef5-44fb-bd34-c1b8bf0aa68f/manager/0.log" Oct 06 09:08:37 crc kubenswrapper[4991]: I1006 09:08:37.244040 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:08:37 crc kubenswrapper[4991]: E1006 09:08:37.244411 4991 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:08:50 crc kubenswrapper[4991]: I1006 09:08:50.244122 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:08:50 crc kubenswrapper[4991]: E1006 09:08:50.244725 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:08:50 crc kubenswrapper[4991]: I1006 09:08:50.790552 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-k2rjt_f65e5c1d-a9f2-4954-b72e-27f2d2895ac0/control-plane-machine-set-operator/0.log" Oct 06 09:08:50 crc kubenswrapper[4991]: I1006 09:08:50.981787 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vr6sj_8a8266da-ca7f-4357-8aa7-86aaa7fb23c6/kube-rbac-proxy/0.log" Oct 06 09:08:51 crc kubenswrapper[4991]: I1006 09:08:51.012697 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vr6sj_8a8266da-ca7f-4357-8aa7-86aaa7fb23c6/machine-api-operator/0.log" Oct 06 09:09:02 crc kubenswrapper[4991]: I1006 09:09:02.820612 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-7n4jj_46ef1459-4df8-4a30-b8c9-5b0b26f4f5a1/cert-manager-controller/0.log" Oct 06 09:09:03 crc kubenswrapper[4991]: I1006 09:09:03.096765 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-k9pnq_991468e7-6b42-4164-9b94-14a34c770f48/cert-manager-cainjector/0.log" Oct 06 09:09:03 crc kubenswrapper[4991]: I1006 09:09:03.213881 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-f84wg_a80db63a-219c-446f-aa49-748b3e9c9c38/cert-manager-webhook/0.log" Oct 06 09:09:05 crc kubenswrapper[4991]: I1006 09:09:05.244012 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:09:05 crc kubenswrapper[4991]: E1006 09:09:05.244218 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:09:14 crc kubenswrapper[4991]: I1006 09:09:14.777517 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-m6d2w_03389c9a-9320-4556-8ddb-77e061a1a6c8/nmstate-console-plugin/0.log" Oct 06 09:09:14 crc kubenswrapper[4991]: I1006 09:09:14.958080 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jzjv8_7a7a3abc-344b-429e-a4eb-d62138e60de4/nmstate-handler/0.log" Oct 06 09:09:14 crc kubenswrapper[4991]: I1006 09:09:14.996502 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2l665_878e15d5-5337-4289-b425-82955b0b6b38/kube-rbac-proxy/0.log" Oct 06 09:09:15 crc kubenswrapper[4991]: I1006 09:09:15.032425 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-2l665_878e15d5-5337-4289-b425-82955b0b6b38/nmstate-metrics/0.log" Oct 06 09:09:15 crc kubenswrapper[4991]: I1006 09:09:15.144193 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-mx8tg_7c58f931-7306-45f7-a983-134a70c9952a/nmstate-operator/0.log" Oct 06 09:09:15 crc kubenswrapper[4991]: I1006 09:09:15.195951 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-2nqhm_2c7406a6-af30-4f22-b109-9ea7e8cc2efe/nmstate-webhook/0.log" Oct 06 09:09:17 crc kubenswrapper[4991]: I1006 09:09:17.244385 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:09:17 crc kubenswrapper[4991]: E1006 09:09:17.245091 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:09:27 crc kubenswrapper[4991]: I1006 09:09:27.811388 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-gh7zn_02de489e-94ff-4cb3-b3c2-7c45f6b64f33/kube-rbac-proxy/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.057212 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-frr-files/0.log" Oct 06 09:09:28 crc 
kubenswrapper[4991]: I1006 09:09:28.123464 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-gh7zn_02de489e-94ff-4cb3-b3c2-7c45f6b64f33/controller/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.220777 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-reloader/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.245079 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-frr-files/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.272323 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-metrics/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.299378 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-reloader/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.455224 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-reloader/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.455274 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-metrics/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.462435 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-frr-files/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.470547 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-metrics/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.662894 4991 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-frr-files/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.684742 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-metrics/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.684787 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/cp-reloader/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.717724 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/controller/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.903849 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/frr-metrics/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.933972 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/kube-rbac-proxy/0.log" Oct 06 09:09:28 crc kubenswrapper[4991]: I1006 09:09:28.961877 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/kube-rbac-proxy-frr/0.log" Oct 06 09:09:29 crc kubenswrapper[4991]: I1006 09:09:29.118183 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/reloader/0.log" Oct 06 09:09:29 crc kubenswrapper[4991]: I1006 09:09:29.170756 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-4w7ls_a134e93b-350e-4bfe-9f9f-b743a7b256c6/frr-k8s-webhook-server/0.log" Oct 06 09:09:29 crc kubenswrapper[4991]: I1006 09:09:29.351989 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-647564c7bd-wrdk2_757d57fb-94e6-41dc-8266-38149c0e932a/manager/0.log" Oct 06 09:09:29 crc kubenswrapper[4991]: I1006 09:09:29.531903 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-97d9b8f56-6282x_00076896-8a73-4599-a610-ce6dca6e6495/webhook-server/0.log" Oct 06 09:09:29 crc kubenswrapper[4991]: I1006 09:09:29.611400 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zmsbl_3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb/kube-rbac-proxy/0.log" Oct 06 09:09:29 crc kubenswrapper[4991]: I1006 09:09:29.826160 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-gb555_a1c2c84a-e1a5-4bad-b383-83790b446262/frr/0.log" Oct 06 09:09:30 crc kubenswrapper[4991]: I1006 09:09:30.026850 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zmsbl_3cd0c363-f3b8-4c9e-9a1c-c2e7902133bb/speaker/0.log" Oct 06 09:09:32 crc kubenswrapper[4991]: I1006 09:09:32.243712 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:09:32 crc kubenswrapper[4991]: E1006 09:09:32.245680 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:09:41 crc kubenswrapper[4991]: I1006 09:09:41.720070 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk_8e1b7145-64f5-4381-86f7-23a8b1bb16ec/util/0.log" Oct 06 09:09:41 crc kubenswrapper[4991]: I1006 
09:09:41.890170 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk_8e1b7145-64f5-4381-86f7-23a8b1bb16ec/util/0.log" Oct 06 09:09:41 crc kubenswrapper[4991]: I1006 09:09:41.916275 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk_8e1b7145-64f5-4381-86f7-23a8b1bb16ec/pull/0.log" Oct 06 09:09:41 crc kubenswrapper[4991]: I1006 09:09:41.942680 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk_8e1b7145-64f5-4381-86f7-23a8b1bb16ec/pull/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.089664 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk_8e1b7145-64f5-4381-86f7-23a8b1bb16ec/util/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.102347 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk_8e1b7145-64f5-4381-86f7-23a8b1bb16ec/extract/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.116576 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb692mqvk_8e1b7145-64f5-4381-86f7-23a8b1bb16ec/pull/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.230682 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct_88652a70-4e73-407d-a897-e9a6613a7fc8/util/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.420567 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct_88652a70-4e73-407d-a897-e9a6613a7fc8/util/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.421515 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct_88652a70-4e73-407d-a897-e9a6613a7fc8/pull/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.446254 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct_88652a70-4e73-407d-a897-e9a6613a7fc8/pull/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.605444 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct_88652a70-4e73-407d-a897-e9a6613a7fc8/pull/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.610846 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct_88652a70-4e73-407d-a897-e9a6613a7fc8/util/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.635995 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2wrcct_88652a70-4e73-407d-a897-e9a6613a7fc8/extract/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.797688 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jzqrc_7e91b878-dd79-4d4f-8e3c-8ef2cea97e04/extract-utilities/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.950646 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jzqrc_7e91b878-dd79-4d4f-8e3c-8ef2cea97e04/extract-content/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 
09:09:42.966574 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jzqrc_7e91b878-dd79-4d4f-8e3c-8ef2cea97e04/extract-utilities/0.log" Oct 06 09:09:42 crc kubenswrapper[4991]: I1006 09:09:42.975180 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jzqrc_7e91b878-dd79-4d4f-8e3c-8ef2cea97e04/extract-content/0.log" Oct 06 09:09:43 crc kubenswrapper[4991]: I1006 09:09:43.147137 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jzqrc_7e91b878-dd79-4d4f-8e3c-8ef2cea97e04/extract-utilities/0.log" Oct 06 09:09:43 crc kubenswrapper[4991]: I1006 09:09:43.167428 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jzqrc_7e91b878-dd79-4d4f-8e3c-8ef2cea97e04/extract-content/0.log" Oct 06 09:09:43 crc kubenswrapper[4991]: I1006 09:09:43.243383 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:09:43 crc kubenswrapper[4991]: E1006 09:09:43.243739 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:09:43 crc kubenswrapper[4991]: I1006 09:09:43.364373 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfklb_be81a987-5591-4f8a-ae8c-1fda1597892e/extract-utilities/0.log" Oct 06 09:09:43 crc kubenswrapper[4991]: I1006 09:09:43.534282 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-jzqrc_7e91b878-dd79-4d4f-8e3c-8ef2cea97e04/registry-server/0.log" Oct 06 09:09:43 crc kubenswrapper[4991]: I1006 09:09:43.549190 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfklb_be81a987-5591-4f8a-ae8c-1fda1597892e/extract-utilities/0.log" Oct 06 09:09:43 crc kubenswrapper[4991]: I1006 09:09:43.549194 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfklb_be81a987-5591-4f8a-ae8c-1fda1597892e/extract-content/0.log" Oct 06 09:09:43 crc kubenswrapper[4991]: I1006 09:09:43.694073 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfklb_be81a987-5591-4f8a-ae8c-1fda1597892e/extract-content/0.log" Oct 06 09:09:43 crc kubenswrapper[4991]: I1006 09:09:43.794005 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfklb_be81a987-5591-4f8a-ae8c-1fda1597892e/extract-content/0.log" Oct 06 09:09:43 crc kubenswrapper[4991]: I1006 09:09:43.802870 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfklb_be81a987-5591-4f8a-ae8c-1fda1597892e/extract-utilities/0.log" Oct 06 09:09:44 crc kubenswrapper[4991]: I1006 09:09:44.030613 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82_8d34a20f-3314-4dd9-aa32-75c53762962e/util/0.log" Oct 06 09:09:44 crc kubenswrapper[4991]: I1006 09:09:44.213716 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82_8d34a20f-3314-4dd9-aa32-75c53762962e/pull/0.log" Oct 06 09:09:44 crc kubenswrapper[4991]: I1006 09:09:44.226255 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82_8d34a20f-3314-4dd9-aa32-75c53762962e/pull/0.log" Oct 06 09:09:44 crc kubenswrapper[4991]: I1006 09:09:44.387475 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82_8d34a20f-3314-4dd9-aa32-75c53762962e/util/0.log" Oct 06 09:09:44 crc kubenswrapper[4991]: I1006 09:09:44.411312 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hfklb_be81a987-5591-4f8a-ae8c-1fda1597892e/registry-server/0.log" Oct 06 09:09:44 crc kubenswrapper[4991]: I1006 09:09:44.500369 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82_8d34a20f-3314-4dd9-aa32-75c53762962e/util/0.log" Oct 06 09:09:44 crc kubenswrapper[4991]: I1006 09:09:44.507240 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82_8d34a20f-3314-4dd9-aa32-75c53762962e/pull/0.log" Oct 06 09:09:44 crc kubenswrapper[4991]: I1006 09:09:44.560218 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ctxl82_8d34a20f-3314-4dd9-aa32-75c53762962e/extract/0.log" Oct 06 09:09:44 crc kubenswrapper[4991]: I1006 09:09:44.662149 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nqf5k_4058fb1d-9049-488e-bf00-25f59b04c065/marketplace-operator/0.log" Oct 06 09:09:44 crc kubenswrapper[4991]: I1006 09:09:44.845492 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l9kb2_938e0ad9-f781-4d0c-be52-67939a233f2f/extract-utilities/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.000313 4991 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l9kb2_938e0ad9-f781-4d0c-be52-67939a233f2f/extract-content/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.019102 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l9kb2_938e0ad9-f781-4d0c-be52-67939a233f2f/extract-utilities/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.069855 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l9kb2_938e0ad9-f781-4d0c-be52-67939a233f2f/extract-content/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.246665 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l9kb2_938e0ad9-f781-4d0c-be52-67939a233f2f/extract-utilities/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.257306 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l9kb2_938e0ad9-f781-4d0c-be52-67939a233f2f/extract-content/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.326291 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p9vs2_094a7c3a-f150-42f1-bc2b-5e53b2565058/extract-utilities/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.386901 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l9kb2_938e0ad9-f781-4d0c-be52-67939a233f2f/registry-server/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.532801 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p9vs2_094a7c3a-f150-42f1-bc2b-5e53b2565058/extract-utilities/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.535789 4991 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-p9vs2_094a7c3a-f150-42f1-bc2b-5e53b2565058/extract-content/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.540834 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p9vs2_094a7c3a-f150-42f1-bc2b-5e53b2565058/extract-content/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.665541 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p9vs2_094a7c3a-f150-42f1-bc2b-5e53b2565058/extract-content/0.log" Oct 06 09:09:45 crc kubenswrapper[4991]: I1006 09:09:45.681039 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p9vs2_094a7c3a-f150-42f1-bc2b-5e53b2565058/extract-utilities/0.log" Oct 06 09:09:46 crc kubenswrapper[4991]: I1006 09:09:46.078193 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p9vs2_094a7c3a-f150-42f1-bc2b-5e53b2565058/registry-server/0.log" Oct 06 09:09:56 crc kubenswrapper[4991]: I1006 09:09:56.243612 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:09:56 crc kubenswrapper[4991]: E1006 09:09:56.244531 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:10:07 crc kubenswrapper[4991]: I1006 09:10:07.248338 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:10:07 crc kubenswrapper[4991]: E1006 09:10:07.248921 4991 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:10:18 crc kubenswrapper[4991]: I1006 09:10:18.243954 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:10:18 crc kubenswrapper[4991]: E1006 09:10:18.244797 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:10:30 crc kubenswrapper[4991]: I1006 09:10:30.244502 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:10:30 crc kubenswrapper[4991]: E1006 09:10:30.245532 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42" Oct 06 09:10:41 crc kubenswrapper[4991]: I1006 09:10:41.248845 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c" Oct 06 09:10:41 crc kubenswrapper[4991]: E1006 09:10:41.249866 4991 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 09:10:46 crc kubenswrapper[4991]: I1006 09:10:46.314011 4991 generic.go:334] "Generic (PLEG): container finished" podID="3acfddb3-dfc8-47ea-b46b-0849059f8247" containerID="e32cb9b28bf433494ddca48e1b60b93376ea05604a9f7b95305917ad829c10b9" exitCode=0
Oct 06 09:10:46 crc kubenswrapper[4991]: I1006 09:10:46.314431 4991 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6tscl/must-gather-cfpzb" event={"ID":"3acfddb3-dfc8-47ea-b46b-0849059f8247","Type":"ContainerDied","Data":"e32cb9b28bf433494ddca48e1b60b93376ea05604a9f7b95305917ad829c10b9"}
Oct 06 09:10:46 crc kubenswrapper[4991]: I1006 09:10:46.314897 4991 scope.go:117] "RemoveContainer" containerID="e32cb9b28bf433494ddca48e1b60b93376ea05604a9f7b95305917ad829c10b9"
Oct 06 09:10:46 crc kubenswrapper[4991]: I1006 09:10:46.980526 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6tscl_must-gather-cfpzb_3acfddb3-dfc8-47ea-b46b-0849059f8247/gather/0.log"
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.018039 4991 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6tscl/must-gather-cfpzb"]
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.018846 4991 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6tscl/must-gather-cfpzb" podUID="3acfddb3-dfc8-47ea-b46b-0849059f8247" containerName="copy" containerID="cri-o://0485e5cb0a1961a784829eec24f985ce959684665333bfef581e202762439697" gracePeriod=2
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.028250 4991 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6tscl/must-gather-cfpzb"]
Oct 06 09:10:54 crc kubenswrapper[4991]: E1006 09:10:54.094725 4991 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3acfddb3_dfc8_47ea_b46b_0849059f8247.slice/crio-0485e5cb0a1961a784829eec24f985ce959684665333bfef581e202762439697.scope\": RecentStats: unable to find data in memory cache]"
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.244542 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c"
Oct 06 09:10:54 crc kubenswrapper[4991]: E1006 09:10:54.245251 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.373464 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6tscl_must-gather-cfpzb_3acfddb3-dfc8-47ea-b46b-0849059f8247/copy/0.log"
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.374079 4991 generic.go:334] "Generic (PLEG): container finished" podID="3acfddb3-dfc8-47ea-b46b-0849059f8247" containerID="0485e5cb0a1961a784829eec24f985ce959684665333bfef581e202762439697" exitCode=143
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.374137 4991 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1407182e61000d14cad69e059e8acd6c0a3ce5c1eedf92cefd5effcf3426a47c"
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.400225 4991 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6tscl_must-gather-cfpzb_3acfddb3-dfc8-47ea-b46b-0849059f8247/copy/0.log"
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.400565 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6tscl/must-gather-cfpzb"
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.550308 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3acfddb3-dfc8-47ea-b46b-0849059f8247-must-gather-output\") pod \"3acfddb3-dfc8-47ea-b46b-0849059f8247\" (UID: \"3acfddb3-dfc8-47ea-b46b-0849059f8247\") "
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.550402 4991 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5jls\" (UniqueName: \"kubernetes.io/projected/3acfddb3-dfc8-47ea-b46b-0849059f8247-kube-api-access-q5jls\") pod \"3acfddb3-dfc8-47ea-b46b-0849059f8247\" (UID: \"3acfddb3-dfc8-47ea-b46b-0849059f8247\") "
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.558654 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3acfddb3-dfc8-47ea-b46b-0849059f8247-kube-api-access-q5jls" (OuterVolumeSpecName: "kube-api-access-q5jls") pod "3acfddb3-dfc8-47ea-b46b-0849059f8247" (UID: "3acfddb3-dfc8-47ea-b46b-0849059f8247"). InnerVolumeSpecName "kube-api-access-q5jls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.631746 4991 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3acfddb3-dfc8-47ea-b46b-0849059f8247-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3acfddb3-dfc8-47ea-b46b-0849059f8247" (UID: "3acfddb3-dfc8-47ea-b46b-0849059f8247"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.652369 4991 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5jls\" (UniqueName: \"kubernetes.io/projected/3acfddb3-dfc8-47ea-b46b-0849059f8247-kube-api-access-q5jls\") on node \"crc\" DevicePath \"\""
Oct 06 09:10:54 crc kubenswrapper[4991]: I1006 09:10:54.652409 4991 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3acfddb3-dfc8-47ea-b46b-0849059f8247-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 06 09:10:55 crc kubenswrapper[4991]: I1006 09:10:55.259908 4991 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3acfddb3-dfc8-47ea-b46b-0849059f8247" path="/var/lib/kubelet/pods/3acfddb3-dfc8-47ea-b46b-0849059f8247/volumes"
Oct 06 09:10:55 crc kubenswrapper[4991]: I1006 09:10:55.381919 4991 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6tscl/must-gather-cfpzb"
Oct 06 09:11:05 crc kubenswrapper[4991]: I1006 09:11:05.244134 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c"
Oct 06 09:11:05 crc kubenswrapper[4991]: E1006 09:11:05.244913 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 09:11:17 crc kubenswrapper[4991]: I1006 09:11:17.244341 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c"
Oct 06 09:11:17 crc kubenswrapper[4991]: E1006 09:11:17.245143 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 09:11:31 crc kubenswrapper[4991]: I1006 09:11:31.243514 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c"
Oct 06 09:11:31 crc kubenswrapper[4991]: E1006 09:11:31.244182 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 09:11:44 crc kubenswrapper[4991]: I1006 09:11:44.243935 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c"
Oct 06 09:11:44 crc kubenswrapper[4991]: E1006 09:11:44.244623 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 09:11:55 crc kubenswrapper[4991]: I1006 09:11:55.243826 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c"
Oct 06 09:11:55 crc kubenswrapper[4991]: E1006 09:11:55.244734 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"
Oct 06 09:12:10 crc kubenswrapper[4991]: I1006 09:12:10.243711 4991 scope.go:117] "RemoveContainer" containerID="6a6b3415f73a4c0c41210eabb7538f4a4466b69ae0783f03af73089665cb999c"
Oct 06 09:12:10 crc kubenswrapper[4991]: E1006 09:12:10.244523 4991 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wpb6m_openshift-machine-config-operator(65471d7d-65b6-49ce-90be-171db9b3cb42)\"" pod="openshift-machine-config-operator/machine-config-daemon-wpb6m" podUID="65471d7d-65b6-49ce-90be-171db9b3cb42"